text
stringlengths
4.06k
10.7k
id
stringlengths
47
47
dump
stringclasses
20 values
url
stringlengths
26
321
file_path
stringlengths
125
142
language
stringclasses
1 value
language_score
float64
0.71
0.98
token_count
int64
1.02k
2.05k
score
float64
3.5
4.53
int_score
int64
4
5
What is quantum computing and why does the future of Earth depend on it? Computing power is reaching a crisis point. If we continue to follow the trend in place since computers were introduced, by 2040, we will not have the capability to power all of the world’s machines, unless we can crack quantum computing. Quantum computers promise faster speeds and more robust security than their classical counterpart, and scientists have been striving to create a quantum computer for decades. What is quantum and how does it help us? Quantum computing differs from classical computing in one fundamental way—the way information is stored. Quantum computing makes the most of a strange property of quantum mechanics, called superposition. It means one ‘unit’ can hold much more information than the equivalent found in classical computing. Information gets stored in ‘bits’ in state ‘1‘ or ‘0,’ like a light switch that turns on or off. By contrast, quantum computing can include a unit of information that can be ‘1,’ ‘0,’ or a superposition of the two states. Think of a superposition as a sphere. ‘1‘ is written at the north pole, and ‘0‘ is written at the south—two classical bits. However, a quantum bit (or qubit) can be found anywhere between the poles. “Quantum bits that can be on and off at the same time, provide a revolutionary, high-performance paradigm where information is stored and processed more efficiently,” said Dr. Kuei-Lin Chiu to Alphr in 2017. Dr. Chiu was a researcher for the quantum mechanical behavior of materials at the Massachusetts Institute of Technology. The ability to store a much higher amount of information in one unit means quantum computing can be faster and more energy-efficient than computers we use today. So why is it so hard to achieve? Qubits, the backbone of a quantum computer, are tricky to make and, once established, are even harder to control. Scientists must get them to interact in specific ways that would work in a quantum computer. Researchers have tried using superconducting materials, ions held in ion traps, individual neutral atoms, and molecules of varying complexity to build them. However, making them hold onto quantum information for a long time is proving difficult. In recent research, scientists at MIT devised a new approach, using a cluster of simple molecules made of just two atoms as qubits. “We are using ultracold molecules as ‘qubits’” Professor Martin Zwierlein, lead author of the paper, told Alphr in 2017. “Molecules have long been proposed as a carrier of quantum information, with very advantageous properties over other systems like atoms, ions, superconducting qubits, etc. “Here, we show for the first time, that you can store such quantum information for extended periods in a gas of ultracold molecules. Of course, an eventual quantum computer will have to also make calculations, for example, have the qubits interact with each other to realize so-called “gates.” Zwierlein continued, “But first, you need to show that you can even hold on to quantum information, and that’s what we have done.” The qubits created at MIT held onto the quantum information longer than previous attempts, but still only for one second. This timeframe might sound short, but it is “in fact on the order of a thousand times longer than a comparable experiment that has been done,” explained Zwierlein. More recently, researchers from the University of New South Wales made a significant breakthrough in the push towards quantum computing. 
They invented a new type of qubit called a flip-flop qubit, which uses both the electron and the nucleus of a phosphorus atom. Flip-flop qubits are controlled by an electrical signal instead of a magnetic one, making them easier to distribute across a chip. The ‘flip-flop’ qubit works by pulling the electron away from the nucleus using an electric field, creating an electric dipole. It is not just qubits, however, that scientists need to figure out. They also need to determine which materials can be used to make quantum computing chips successfully. Chiu’s paper, published earlier in 2017, identified ultra-thin layers of materials that could form the basis for a quantum computing chip. Chiu said to Alphr, “The interesting thing about this research is how we choose the right material, find out its unique properties, and use its advantage to build a suitable qubit.” “Moore’s Law predicts that the density of transistors on silicon chips doubles approximately every 18 months,” Chiu told Alphr. “However, these progressively shrunken transistors will eventually reach a small scale where quantum mechanics play an important role.” Moore’s Law, which Chiu referred to, is an observation made by Intel co-founder Gordon Moore in 1965. It is commonly stated as the overall processing power of computers doubling about every two years. As Chiu notes, transistors are shrinking toward scales where quantum effects start to dominate—a limit that quantum computing chips could potentially answer. Is quantum computing the ultimate vaporware? Vaporware is a product, typically software, that is advertised but never actually released: heavily marketed, yet it never sees the light of day. Despite decades of optimistic predictions about the impact of quantum computers, and real advances in business and research settings, how close are we to achieving the dream of quantum computing? Is it destined to remain vaporware, or will it become something of practical use? We delve into the reality of quantum computing in another article. In summary, a quantum computer will likely perform a contrived computation of no practical use faster than a conventional computer within the next year or two. However, getting there won’t be straightforward, and it won’t be cheap or immediately beneficial for everyday consumers.
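To make the superposition picture concrete, here is a minimal sketch of a single qubit written as a pair of complex amplitudes. It illustrates the standard textbook description rather than any code from the article, and the function names are invented for the example:

```python
import numpy as np

# A qubit is described by two complex amplitudes (a, b) for the states |0> and |1>,
# normalised so that |a|^2 + |b|^2 = 1.
def make_qubit(a, b):
    state = np.array([a, b], dtype=complex)
    return state / np.linalg.norm(state)      # enforce normalisation

def measurement_probabilities(state):
    # Born rule: the chance of reading 0 or 1 is the squared magnitude
    # of the corresponding amplitude.
    return np.abs(state) ** 2

classical_zero = make_qubit(1, 0)           # behaves like an ordinary bit set to 0
between_the_poles = make_qubit(1, 1)        # an equal superposition of 0 and 1

print(measurement_probabilities(classical_zero))     # [1. 0.]
print(measurement_probabilities(between_the_poles))  # [0.5 0.5]
```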
<urn:uuid:18770fd0-9d5d-4dcb-bdfa-77c5c3ed61d8>
CC-MAIN-2021-10
https://www.alphr.com/technology/1006491/what-is-quantum-computing-and-why-does-the-future-of-earth-depend-on-it
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178358064.34/warc/CC-MAIN-20210227024823-20210227054823-00544.warc.gz
en
0.93788
1,274
3.6875
4
Moving quantum computation from the labs and into the real world will require more precise ways to measure performance. Researchers at the Department of Energy’s Oak Ridge National Laboratory have taken a step in that direction by developing a quantum chemistry simulation benchmark to evaluate the performance of quantum devices and guide the development of applications for future quantum computers, according to a news release. Quantum computers use the laws of quantum mechanics and units known as qubits to greatly increase the threshold at which information can be transmitted and processed. Whereas traditional “bits” have a value of either 0 or 1, qubits are encoded with values of both 0 and 1, or any combination thereof, allowing for a vast number of possibilities for storing data. While still in their early stages, quantum systems have the potential to be exponentially more powerful than today’s leading classical computing systems and promise to revolutionize research in materials, chemistry, high-energy physics, and across the scientific spectrum. But because these systems are in their relative infancy, understanding what applications are well suited to their unique architectures is considered an important field of research. “We are currently running fairly simple scientific problems that represent the sort of problems we believe these systems will help us to solve in the future,” said ORNL’s Raphael Pooser, principal investigator of the Quantum Testbed Pathfinder project. “These benchmarks give us an idea of how future quantum systems will perform when tackling similar, though exponentially more complex, simulations.” Pooser and his colleagues calculated the bound state energy of alkali hydride molecules on 20-qubit IBM Tokyo and 16-qubit Rigetti Aspen processors. These molecules are simple and their energies well understood, allowing them to effectively test the performance of the quantum computer. By tuning the quantum computer as a function of a few parameters, the team calculated these molecules’ bound states with chemical accuracy, which was obtained using simulations on a classical computer. Of equal importance is the fact that the quantum calculations also included systematic error mitigation, illuminating the shortcomings in current quantum hardware. Systematic error occurs when the “noise” inherent in current quantum architectures affects their operation. Because quantum computers are extremely delicate (for instance, the qubits used by the ORNL team are kept in a dilution refrigerator at around 20 millikelvin (or more than -450 degrees Fahrenheit), temperatures and vibrations from their surrounding environments can create instabilities that throw off their accuracy. For instance, such noise may cause a qubit to rotate 21 degrees instead of the desired 20, greatly affecting a calculation’s outcome. “This new benchmark characterizes the ‘mixed state,’ or how the environment and machine interact, very well,” Pooser said. 
“This work is a critical step toward a universal benchmark to measure the performance of quantum computers, much like the LINPACK metric is used to judge the fastest classical computers in the world.” While the calculations were fairly simple compared to what is possible on leading classical systems such as ORNL’s Summit, currently ranked as the world’s most powerful computer, quantum chemistry, along with nuclear physics and quantum field theory, is considered a quantum “killer app.” In other words, it is believed that as they evolve quantum computers will be able to more accurately and more efficiently perform a wide swathe of chemistry-related calculations better than any classical computer currently in operation, including Summit. “The current benchmark is a first step towards a comprehensive suite of benchmarks and metrics that govern the performance of quantum processors for different science domains,” said ORNL quantum chemist Jacek Jakowski. “We expect it to evolve with time as the quantum computing hardware improves. ORNL’s vast expertise in domain sciences, computer science and high-performance computing make it the perfect venue for the creation of this benchmark suite.” ORNL has been planning for paradigm-shifting platforms such as quantum for more than a decade via dedicated research programs in quantum computing, networking, sensing and quantum materials. These efforts aim to accelerate the understanding of how near-term quantum computing resources can help tackle today’s most daunting scientific challenges and support the recently announced National Quantum Initiative, a federal effort to ensure American leadership in quantum sciences, particularly computing. Such leadership will require systems like Summit to ensure the steady march from devices such as those used by the ORNL team to larger-scale quantum systems exponentially more powerful than anything in operation today. Access to the IBM and Rigetti processors was provided by the Quantum Computing User Program at the Oak Ridge Leadership Computing Facility, which provides early access to existing, commercial quantum computing systems while supporting the development of future quantum programmers through educational outreach and internship programs. Support for the research came from DOE’s Office of Science Advanced Scientific Computing Research program. “This project helps DOE better understand what will work and what won’t work as they forge ahead in their mission to realize the potential of quantum computing in solving today’s biggest science and national security challenges,” Pooser said. Next, the team plans to calculate the exponentially more complex excited states of these molecules, which will help them devise further novel error mitigation schemes and bring the possibility of practical quantum computing one step closer to reality. UT-Battelle manages ORNL for DOE’s Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit https://science.energy.gov/.
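The article's example of noise, a qubit rotating 21 degrees instead of the intended 20, can be illustrated with a small numerical sketch. This is not the ORNL team's code; the matrix is the standard single-qubit X rotation, and the repeat counts are arbitrary:

```python
import numpy as np

def rotation_x(theta):
    # Standard single-qubit rotation about the X axis by theta radians.
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

def prob_one(theta, repeats):
    # Probability of measuring |1> after `repeats` identical rotations from |0>.
    state = np.array([1.0, 0.0], dtype=complex)
    for _ in range(repeats):
        state = rotation_x(theta) @ state
    return abs(state[1]) ** 2

ideal = np.deg2rad(20)    # intended rotation
noisy = np.deg2rad(21)    # systematic over-rotation, as in the article's example
for n in (1, 5, 20):
    print(n, round(prob_one(ideal, n), 4), round(prob_one(noisy, n), 4))
# The one-degree error is tiny for a single gate but compounds over a deep circuit,
# which is why the benchmark pairs its calculations with systematic error mitigation.
```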
<urn:uuid:c99a1221-9b63-4008-a178-fc0a79aa2363>
CC-MAIN-2021-10
https://thequantumdaily.com/2020/01/09/researchers-develop-benchmark-to-better-measure-quantum-performance/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178360853.31/warc/CC-MAIN-20210228115201-20210228145201-00507.warc.gz
en
0.92759
1,156
3.671875
4
When it comes to studying transportation systems, stock markets and the weather, quantum mechanics is probably the last thing to come to mind. However, scientists at Australia’s Griffith University and Singapore’s Nanyang Technological University have just performed a ‘proof of principle’ experiment showing that when it comes to simulating such complex processes in the macroscopic world, quantum mechanics can provide an unexpected advantage. Griffith’s Professor Geoff Pryde, who led the project, says that such processes could be simulated using a “quantum hard drive”, much smaller than the memory required for conventional simulations. “Stephen Hawking once stated that the 21st century is the ‘century of complexity’, as many of today’s most pressing problems, such as understanding climate change or designing transportation system, involve huge networks of interacting components,” he says. “Their simulation is thus immensely challenging, requiring storage of unprecedented amounts of data. What our experiments demonstrate is a solution may come from quantum theory, by encoding this data into a quantum system, such as the quantum states of light.” Einstein once said that “God does not play dice with the universe,” voicing his disdain for the idea that quantum particles contain intrinsic randomness. “But theoretical studies showed that this intrinsic randomness is just the right ingredient needed to reduce the memory cost for modelling partially random statistics,” says Dr Mile Gu, a member of the team who developed the initial theory. In contrast with the usual binary storage system – the zeroes and ones of bits – quantum bits can be simultaneously 0 and 1, a phenomenon known as quantum superposition. The researchers, in their paper published in Science Advances, say this freedom allows quantum computers to store many different states of the system being simulated in different superpositions, using less memory overall than a classical computer. The team constructed a proof-of-principle quantum simulator using a photon – a single particle of light – interacting with another photon. The data showed that the quantum system could complete the task with much less information stored than the classical computer – a factor-of-20 improvement at the best point. “Although the system was very small – even the ordinary simulation required only a single bit of memory – it proved that quantum advantages can be achieved,” Pryde says. “Theoretically, large improvements can also be realised for much more complex simulations, and one of the goals of this research program is to advance the demonstrations to more complex problems.”
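To get a feel for the memory problem the researchers describe, here is a rough, purely illustrative calculation of the classical storage needed to track the joint state of a network of interacting two-state components. The byte count per probability is an assumption, and none of this is the team's actual model:

```python
# Illustrative only: classical memory needed to keep one probability per joint
# configuration of N interacting two-state components. Numbers are hypothetical.
def classical_state_bytes(n_components, bytes_per_probability=8):
    n_configurations = 2 ** n_components        # every possible joint configuration
    return n_configurations * bytes_per_probability

for n in (10, 30, 50):
    print(f"{n} components -> {classical_state_bytes(n):,} bytes")
# 10 components need ~8 KB, 30 need ~8.6 GB, and 50 need roughly 9 million GB,
# which is why compressing this bookkeeping into quantum states is attractive.
```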
<urn:uuid:9feb1ca1-ee62-42d7-89ae-496ee27730f7>
CC-MAIN-2021-10
https://innovationtoronto.com/2017/02/135314/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178368431.60/warc/CC-MAIN-20210304021339-20210304051339-00586.warc.gz
en
0.915206
1,167
3.828125
4
A device that eavesdrops on the quantum whispers of atoms could form the basis of a new type of quantum computer. Stanford physicists have developed a “quantum microphone” so sensitive that it can measure individual particles of sound, called phonons. The device, which is detailed July 24 in the journal Nature, could eventually lead to smaller, more efficient quantum computers that operate by manipulating sound rather than light. “We expect this device to allow new types of quantum sensors, transducers and storage devices for future quantum machines,” said study leader Amir Safavi-Naeini, an assistant professor of applied physics at Stanford’s School of Humanities and Sciences. Quantum of motion First proposed by Albert Einstein in 1907, phonons are packets of vibrational energy emitted by jittery atoms. These indivisible packets, or quanta, of motion manifest as sound or heat, depending on their frequencies. Like photons, which are the quantum carriers of light, phonons are quantized, meaning their vibrational energies are restricted to discrete values – similar to how a staircase is composed of distinct steps. “Sound has this granularity that we don’t normally experience,” Safavi-Naeini said. “Sound, at the quantum level, crackles.” The energy of a mechanical system can be represented as different “Fock” states – 0, 1, 2, and so on – based on the number of phonons it generates. For example, a “1 Fock state” consist of one phonon of a particular energy, a “2 Fock state” consists of two phonons with the same energy, and so on. Higher phonon states correspond to louder sounds. Until now, scientists have been unable to measure phonon states in engineered structures directly because the energy differences between states – in the staircase analogy, the spacing between steps – is vanishingly small. “One phonon corresponds to an energy ten trillion trillion times smaller than the energy required to keep a lightbulb on for one second,” said graduate student Patricio Arrangoiz-Arriola, a co-first author of the study. To address this issue, the Stanford team engineered the world’s most sensitive microphone – one that exploits quantum principles to eavesdrop on the whispers of atoms. In an ordinary microphone, incoming sound waves jiggle an internal membrane, and this physical displacement is converted into a measurable voltage. This approach doesn’t work for detecting individual phonons because, according to the Heisenberg uncertainty principle, a quantum object’s position can’t be precisely known without changing it. “If you tried to measure the number of phonons with a regular microphone, the act of measurement injects energy into the system that masks the very energy that you’re trying to measure,” Safavi-Naeini said. Instead, the physicists devised a way to measure Fock states – and thus, the number of phonons – in sound waves directly. “Quantum mechanics tells us that position and momentum can’t be known precisely – but it says no such thing about energy,” Safavi-Naeini said. “Energy can be known with infinite precision.” The quantum microphone the group developed consists of a series of supercooled nanomechanical resonators, so small that they are visible only through an electron microscope. The resonators are coupled to a superconducting circuit that contains electron pairs that move around without resistance. The circuit forms a quantum bit, or qubit, that can exist in two states at once and has a natural frequency, which can be read electronically. 
When the mechanical resonators vibrate like a drumhead, they generate phonons in different states. “The resonators are formed from periodic structures that act like mirrors for sound. By introducing a defect into these artificial lattices, we can trap the phonons in the middle of the structures,” Arrangoiz-Arriola said. Like unruly inmates, the trapped phonons rattle the walls of their prisons, and these mechanical motions are conveyed to the qubit by ultra-thin wires. “The qubit’s sensitivity to displacement is especially strong when the frequencies of the qubit and the resonators are nearly the same,” said joint first-author Alex Wollack, also a graduate student at Stanford. However, by detuning the system so that the qubit and the resonators vibrate at very different frequencies, the researchers weakened this mechanical connection and triggered a type of quantum interaction, known as a dispersive interaction, that directly links the qubit to the phonons. This bond causes the frequency of the qubit to shift in proportion to the number of phonons in the resonators. By measuring the qubit’s changes in tune, the researchers could determine the quantized energy levels of the vibrating resonators – effectively resolving the phonons themselves. “Different phonon energy levels appear as distinct peaks in the qubit spectrum,” Safavi-Naeini said. “These peaks correspond to Fock states of 0, 1, 2 and so on. These multiple peaks had never been seen before.” Mechanical quantum mechanical Mastering the ability to precisely generate and detect phonons could help pave the way for new kinds of quantum devices that are able to store and retrieve information encoded as particles of sound or that can convert seamlessly between optical and mechanical signals. Such devices could conceivably be made more compact and efficient than quantum machines that use photons, since phonons are easier to manipulate and have wavelengths that are thousands of times smaller than light particles. “Right now, people are using photons to encode these states. We want to use phonons, which brings with it a lot of advantages,” Safavi-Naeini said. “Our device is an important step toward making a ‘mechanical quantum mechanical’ computer.”
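The dispersive readout described above, where each phonon shifts the qubit frequency by a fixed amount so that Fock states show up as separate peaks, can be sketched numerically. This is a toy model, not the Stanford device: the frequencies, the per-phonon shift, the line shape and the assumed populations are all invented for illustration:

```python
import numpy as np

F_QUBIT = 5.0e9      # bare qubit frequency in Hz (illustrative)
CHI = 2.0e6          # frequency shift per phonon in Hz (illustrative)
LINEWIDTH = 0.5e6    # peak width in Hz (illustrative)
POPULATIONS = {0: 0.6, 1: 0.3, 2: 0.1}   # assumed phonon-number occupations

def qubit_spectrum(freqs):
    # Each Fock state n contributes a peak centred at F_QUBIT + n * CHI,
    # weighted by how often that phonon number occurs.
    spectrum = np.zeros_like(freqs)
    for n, weight in POPULATIONS.items():
        center = F_QUBIT + n * CHI
        spectrum += weight * LINEWIDTH**2 / ((freqs - center)**2 + LINEWIDTH**2)
    return spectrum

freqs = np.linspace(F_QUBIT - 2e6, F_QUBIT + 6e6, 2001)
intensity = qubit_spectrum(freqs)
for n in POPULATIONS:
    print(f"Fock state {n}: peak near {(F_QUBIT + n * CHI) / 1e9:.4f} GHz")
```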
<urn:uuid:c2f8d0c9-a772-421d-a3e7-3125ce93be29>
CC-MAIN-2021-10
https://innovationtoronto.com/2019/07/the-basis-of-a-new-type-of-quantum-computer/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178363217.42/warc/CC-MAIN-20210302034236-20210302064236-00186.warc.gz
en
0.921657
1,842
3.71875
4
This article will be dealing with how computers calculate trigonometric ratios, logarithms, and exponents. We will be exploring the mathematics behind these functions and shall end with a proof for the famous e^πi = -1. The article would be a pretty light read for anyone familiar with basic differentiation formulas such as those for cos(x), sin(x), and e^x. Even if the reader isn’t aware of these formulas, I’ve tried my best to make the article approachable for a general audience. Let’s start by talking about polynomials. A polynomial is any function of a variable that involves only multiplication, subtraction, and addition of powers of that variable. Polynomials are of different degrees, and the degree of the polynomial is the highest power of the variable in the function. We denote the function by f(x), and it represents the mathematical processes we are carrying out on our variable x. Now our n-degree polynomial is given by: f(x) = a0 + a1x + a2x^2 + a3x^3 + … + anx^n. Imagine you are struck by the particularly brilliant thought which makes you ask if you can represent any function f(x) as one of these polynomials. For whatever reason, you decide that you shall first try to express sin(x) and cos(x) as one of these polynomials. You enthusiastically write down your first equation: sin(x) = a0 + a1x + a2x^2 + a3x^3 + … You cleverly come up with the idea of plugging in x as zero to eliminate all the x terms, as zero to any power is zero in our polynomial; since sin(0) = 0, this tells you that the constant a0 is 0. Now that we’ve gotten the constant out of the way, you get down to the task of figuring out each of the remaining coefficients for this polynomial. You learned somewhere that the derivative of sin(x) is given by d(sin(x))/dx = cos(x), conveniently learned that the derivative of ax^n is given by d(ax^n)/dx = (n)(a)(x^(n-1)), and know that the derivative of a constant c is zero. You write these results down to remember them, along with a few other things you learned that you think might be useful, such as d(cos(x))/dx = -sin(x). Since you know that cos(0) = 1, you go ahead and differentiate the equation f(x) and write it as f ’(x) to get a new equation you can work with: cos(x) = a1 + 2a2x + 3a3x^2 + …, and plugging in x = 0 gives a1 = 1. You go ahead and continue differentiating multiple times and get: -sin(x) = 2a2 + 6a3x + …, -cos(x) = 6a3 + 24a4x + …, and so on, setting x = 0 each time to pick off the next coefficient. You notice that this is an infinite process, but the coefficients of every term with an even power are zero and only the terms with odd powers of x remain. The odd powers of x seem to be the ones remaining, and their coefficients seem to be of the form 1/(the power’s factorial) or -1/(the power’s factorial). The plus and minus alternate, with every second term being negative. Factorial is the multiplication of all the natural numbers (in this case) from one up to the number itself and is represented as the number followed by an exclamation mark. For example 1! = 1, 2! = 2(1), 3! = (3)(2)(1), 4! = 4(3)(2)(1), and in general k! = k(k-1)(k-2)….(1). So you write down your observations: sin(x) = x - x^3/3! + x^5/5! - x^7/7! + … With that, you have converted sin(x), a function seemingly related only to triangles and circles, into an infinite polynomial in which substituting any x will get you closer and closer to the value of sin(x) the more terms you choose to add. Following a similar process for cos(x), we can obtain its polynomial, cos(x) = 1 - x^2/2! + x^4/4! - x^6/6! + …, and with some knowledge of limits of a function and such, we can also obtain the polynomial for e^x, namely e^x = 1 + x + x^2/2! + x^3/3! + … Indeed, these exact formulas were the ones calculators used to compute sin(x), cos(x), tan(x), or any number raised to another power. For exponentiation, say k^x: the calculator calculates the value of log(k) (the natural logarithm of k) and then substitutes xlog(k) into the e^x expansion.
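The author later invites readers to code these expansions; a minimal sketch along those lines follows. The function names and the number of terms are my own choices, and log here means the natural logarithm, as in the article:

```python
from math import factorial, log

def sin_series(x, terms=12):
    # sin(x) = x - x^3/3! + x^5/5! - ...
    return sum((-1)**k * x**(2*k + 1) / factorial(2*k + 1) for k in range(terms))

def cos_series(x, terms=12):
    # cos(x) = 1 - x^2/2! + x^4/4! - ...
    return sum((-1)**k * x**(2*k) / factorial(2*k) for k in range(terms))

def exp_series(x, terms=25):
    # e^x = 1 + x + x^2/2! + x^3/3! + ...
    return sum(x**k / factorial(k) for k in range(terms))

def power(k, x):
    # k^x computed the way the article describes: substitute x*log(k) into e^x.
    return exp_series(x * log(k))

print(sin_series(1.0), cos_series(1.0))   # close to sin(1) ≈ 0.8415, cos(1) ≈ 0.5403
print(power(2, 0.5))                      # close to sqrt(2) ≈ 1.4142
```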
The above approach should also help you better grasp the fact that exponentiation isn’t just repeated multiplication, and how raising numbers to the power of a fraction like 1/2 might not make sense under repeated multiplication, but makes sense when we think about the number as an input to our polynomial, which we then know how to work with. The precision of your calculator naturally depends on how many of these terms in the expansion it adds up, but the infinite sum can be approximated pretty well with just a few terms, as the terms rapidly get smaller and the sum converges to a value. Now that we have managed to make exponentiation a polynomial, it would seem less absurd to input a complex number as the exponent, given that the solutions/zeroes of many polynomials are complex numbers. Flowing with this train of thought, let’s try raising e to the power of ix, where i is the square root of -1. Substituting ix into the expansion gives e^(ix) = 1 + ix - x^2/2! - ix^3/3! + x^4/4! + ix^5/5! - … Now stare at our final expression for a while, try to notice some patterns, and try simplifying it into the two other infinite polynomials we have discussed. If you spotted it then great, but if not, here’s how it breaks down: the terms without i are exactly the expansion of cos(x), and the terms with i are i times the expansion of sin(x), so e^(ix) = cos(x) + i sin(x). With that, we have just defined a way to raise any number to a complex power. Now, let me prove what was promised in the title: setting x = π gives e^(πi) = cos(π) + i sin(π) = -1 + 0i = -1. We have just proved a result that many argue is the most beautiful result in all of mathematics, but we have more important things to think about. Let us look at what this means for any complex number z and its representation in the Argand plane, where the usual y-axis is replaced by an imaginary axis which tells us the value y if z = x + yi. Much the same way we plot any point (x,y), a complex number x + yi can be represented by a line from the origin to the point (x,y). We call the length of this line the modulus of the complex number, and the angle it makes with the x-axis its argument. This means that any complex number z can be written as |z| e^(i𝞱), which says that z sits on a circle centered at the origin whose radius has length |z| and makes an angle 𝞱 with the x-axis. The x value of the complex number is |z|(cos(𝞱)) and the y value is |z|(sin(𝞱)). This leads us to z = |z| cos(𝞱) + |z| i sin(𝞱). This greatly simplifies the multiplication of complex numbers, as z1 z2 = |z1||z2| e^(i(𝞱1 + 𝞱2)): the moduli multiply and the arguments add. This shows us that if we take any line which represents a complex number and multiply it by another number, it gets scaled (stretched or squished) by the modulus of the second number and then rotated by an angle equal to the argument of the second complex number. This can be used in the scaling and rotation of objects or images by assigning each point or pixel a specific complex number and then multiplying it by a complex number whose argument is the angle you want to rotate by and whose modulus is the desired scaling amount. These results are also quite significant for 2-D rotational motion in Newtonian mechanics, and the development of vectors and vector analysis, in fact, comes from complex numbers and higher-dimensional complex number systems called quaternions. I encourage the reader to try and code functions, recursive or otherwise, to compute the sin, cos, or log values using the polynomials I have mentioned today. You might also have several useful and key insights by thinking about the rotation properties I mentioned above and how they might help you calculate the nth roots of real numbers, by thinking of the real numbers as having argument nπ where n belongs to the integers.
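A short sketch of the rotation-and-scaling idea, plus a numerical check of e^(iπ) = -1, is below. It uses Python's built-in complex numbers; the sample points and angle are arbitrary examples:

```python
import cmath

def rotate_and_scale(points, angle_radians, scale=1.0):
    # Multiply each point, treated as a complex number, by scale * e^(i*angle):
    # the modulus scales the point, the argument rotates it about the origin.
    w = scale * cmath.exp(1j * angle_radians)
    return [complex(x, y) * w for x, y in points]

corners = [(1, 0), (0, 1), (-1, 0), (0, -1)]
print(rotate_and_scale(corners, cmath.pi / 2))   # every corner rotated by 90 degrees

# Euler's identity falls out of the same machinery, up to floating-point rounding:
print(cmath.exp(1j * cmath.pi))                  # approximately (-1+0j)
```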
The same line of reasoning will also help you understand why the complex roots of a polynomial with real coefficients always come in conjugate pairs. I encourage you to also go through the links I have provided below for better depth and understanding of the results I have used today; they will certainly help you see the bigger picture when it comes to the importance of these formulas. I would like to cover more serious topics by talking about things like quantum computing, the Fourier series, Fermat’s little theorem, and other crucial mathematical results that play a big role in modern computers. Hence, the articles I plan on writing will be pretty long and technical, so please let me know if those are topics you might be interested in. I mentioned that these polynomials were what calculators used initially; nowadays there are optimizations and matrix methods that can be used for these computations, topics that I might cover in future articles. ~ Koka Sathwik
<urn:uuid:fefb5c8a-1aa9-4ecb-91b5-e49348e9ad6d>
CC-MAIN-2021-10
https://thecodestories.com/2020/05/02/calculators-rotation-and-e%CF%80i/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178369721.76/warc/CC-MAIN-20210305030131-20210305060131-00470.warc.gz
en
0.941496
1,864
3.84375
4
Quantum computing is hailed as the future holy grail of information processing. However, quantum computing machines are very complex, delicate, and cumbersome. They also require exotic materials such as superconducting metals or levitated atoms. But new developments demonstrated in two recently published studies may prove revolutionary — they suggest that quantum states can be controlled in regular, everyday devices. Quantum mechanics in classical semiconductors It’s difficult to grasp just how quantum computers work, but if we were to simplify things, the gist would be that digital computers require data to be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), whereas quantum computers use qubits, also known as quantum bits. A qubit is the quantum analog of the digital bit that encodes information in 1s and 0s. The crucial difference is that a quantum bit can exist in both states at the same time due to a quantum quirk called superposition. It’s akin to saying that a switch is both on and off at the same time or that water is both flowing and not flowing through a pipe simultaneously — which, in day to day life, makes absolutely no sense, but in the quantum domain, few things are reasonable. Two-qubits can perform operations on four values, three on eight values and so on in powers of two. Today’s computers have millions of transistors. Now imagine a quantum logic gate that works with millions of qubits. The computing force would be unheard of, allowing scientists to solve currently intractable problems and perform complex models that take longer than the age of the universe for today’s fastest supercomputers to process. Although classical computing systems have always been thought of as under-equipped to read and maintain quantum states, scientists at the University of Chicago’s Pritzker School of Molecular Engineering showed that this isn’t necessarily true. “The ability to create and control high-performance quantum bits in commercial electronics was a surprise,” said lead investigator David Awschalom, professor of molecular engineering at the University of Chicago. “These discoveries have changed the way we think about developing quantum technologies—perhaps we can find a way to use today’s electronics to build quantum devices.” Awschalom and colleagues published a paper in the journal Science, demonstrating they could electrically control quantum states embedded in silicon carbide. Immediately, this opens the possibility of designing quantum computers based on traditional materials, which could vastly accelerate their development. What’s more, quantum states in silicon carbide emit single particles of light with a wavelength near the telecommunications band. “This makes them well suited to long-distance transmission through the same fiber-optic network that already transports 90 percent of all international data worldwide,” said Awschalom, who is also a senior scientist at Argonne National Laboratory and director of the Chicago Quantum Exchange. In a second paper, which was published in Science Advances, the researchers were able to combine these light particles with existing electronics to make a “quantum FM radio”. They claim that just like the information that plays music in your car is transmitted through the air over long distances, so can quantum information be exchanged wirelessly. One important challenge that researchers managed to overcome was quantum noise. 
Common semiconductor devices have impurities, which can scramble quantum information by adding noise to the electrical environment. This is why quantum research exclusively uses pure materials that are free of fluctuating fields. But, the researchers managed to eliminate noise in the quantum signal by employing one of the most basic electronics components — the diode, a one-way switch for current. “In our experiments, we need to use lasers, which unfortunately jostle the electrons around. It’s like a game of musical chairs with electrons; when the light goes out everything stops, but in a different configuration,” said graduate student Alexandre Bourassa. “The problem is that this random configuration of electrons affects our quantum state. But we found that applying electric fields removes the electrons from the system and makes it much more stable.” For decades, consumers have enjoyed the dividends of Moore’s Law — an empirical observation that states the number of transistors in an integrated circuit doubles every two years or so. This prediction has proven true ever since it was first proposed by American engineer Gordon Moore in 1965. However, there’s a physical limit to how many transistors you can cram into a chip — and in a decade we should all be feeling it. But, by integrating quantum mechanics with classical semiconductor technology, these new developments suggest that we might not only avoid Moore’s brick wall but scale computing power to incredible heights. “This work brings us one step closer to the realization of systems capable of storing and distributing quantum information across the world’s fiber-optic networks,” Awschalom said. “Such quantum networks would bring about a novel class of technologies allowing for the creation of unhackable communication channels, the teleportation of single electron states and the realization of a quantum internet.”
<urn:uuid:f49277a3-4a28-45b8-b49d-acad0950d57a>
CC-MAIN-2021-10
https://www.qpute.com/2019/12/11/physicists-produce-quantum-states-in-ordinary-electronics-via-qpute-com/
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178385984.79/warc/CC-MAIN-20210309030723-20210309060723-00270.warc.gz
en
0.937646
1,083
3.84375
4
The photon is a sine wave related to the E and B fields, but only of one period. As an open string it has two ends. The question is, what does their physical reality look like? First, two parallel photons of the same wavelength are to be investigated. From the chapter about neutrinos as oscis it can be deduced that their approach can never be total. According to the dilemma of QT, this behavior cannot be attributed to the photon as a quantum, but is owed to field theory. After the dilemma of the QT, annihilation cannot occur in any case. So it just seems like there's zero point fluctuation. But after the TO they do not exist! Polarisation: Simply changing the right angle between the E and B fields is not allowed after the QT dilemma, but the right angle refers to the unbent space-time. Polarization can also be generated with photons twisted against each other. After the last chapter, however, no mathematically correct rotary polarization can occur. This should be observable experimentally! Polarization and torsion of the wave in the gravitational field are to be distinguished. In the latter case, the wave appears rotated in the before-after comparison. Shortest wavelength: It is determined in the TO by the minimum radius of curvature of the circular wave in the plane of the E field. The reason for this is that quantum processes cannot have a smaller wavelength. With 5,876516699923 10-16 m as minimum radius is λ0 = 3,69232433863517 10-15 m,and thus E0 = h c/λ0 = 0,33578900862721 GeV (in the gamma radiation range). The quantum number, which stands for the area integral of the sine wave, was determined arbitrarily. The area remains intact if the wavelength and amplitude are changed in inverse proportion. One more comment on the maximum energy of photons: Photons can link up. It is then a quantum object. The behavior of the graph is more complex because the amplitude influences its length change. The reference figure is the extension factor λ'. It sets the length difference of graph and wavelength in relation to the wavelength, which should be in the counter of the quotient. Its function curve as a function of amplitude a is known: If a < 1, their function value runs asymptotically against 0,25,if a > 1, it runs asymptotically against 0, which means that a = 1 is a turning point.Even if λ0' still increases with increasing wavelength, remains λ0' under λ0' ‧ 1,16, because 0,25/0,216... = 1,15739... < 1,16. The shortest wavelength is identified with a = 1, because then the extreme values lie on top of each other. If a were greater than 1, there would have to be an effect that can be associated with the turning point. This is not known. The arguments are admittedly weak, but perhaps a stronger argument will be found. For the shortest wavelength λ0 = 3,69232433863517 10-15 m is then λ0' = 0,2160028025443. The wavelength of the photon must remain constant in empty space. In the prestressed space-time continuum, this can only be ensured by limiting the oscillation space to the wavelength, which again results in the principle of constriction. The corresponding space-time line is constricted to the wavelength based on the length of the graph. With wD = - 1,09020236896306 10-11 kgm/s2 as the energy density of the space-time line, it is possible to for the photon with the shortest wavelength, establish the following energy equation: E0 ART + E0 kin = wD λ0 λ0' + E0 kin = -8,69493521355215 10-27 kgm²/s² + E0 kin = 0,where E0 ART corresponds to a mass defect of -6,03829837739599 10-25 eV/c². 
Even the longest wavelength photon thus remains below 1.16 times this value! The above equation must give 0 because the TO does not allow a negative energy balance (no zero fluctuation). So the photon is only massless at c - see also INTERPRETATION OF THE MASS. The first summand is the energy with which the photon counteracts its extension (holding the wavelength). With a higher negative energy density, this no longer works, which leads to a redshift. The above equation can also be interpreted differently. With the extension factor λ0' the photon is ironed (amplitude = 0). Thus, the photon once again confirms the conclusion that the universe has an event horizon. If, conversely, the rest mass were known, wD could be confirmed. Gravitational and electromagnetic field theories show a relationship. This is reflected in the gravitational shift of the wavelength and the gravitational lens effect. Due to this relationship, it cannot be ruled out that they may have a resonant effect. The damping factors must be determined for this purpose. Since a photon remains photon as long as it is not absorbed, its electromagnetic damping is 0. A resonance catastrophe cannot occur with the dilemma of QT at the quantum level! If the damping in the gravitational space were to be √½, the resonance increase would be 1. Thus the resonance amplitude would decrease exactly to the same extent as the effect of gravity. At stronger damping, gravity waves would not be detectable as forced swinging of astronomic events. In order not to have to think about a solution to Einstein's field equations, it is first a question of the prerequisite that leaves the constriction area unchanged. Its relativistic compression is of no interest. In any case, this requires that the divergence of the vector field at the ends of the constriction must remain 0 at certain points, because this is the only way to maintain the one-sided unsteadiness at the ends of the electromagnetic wave. Einstein's field equations allow this (conservation of energy and impulses). If one considers the preservation of the constriction under the aspect of the resonance, then the damping must be √½, because only in this way the divergence remains 0. Divergence 0 affects the problem known as the classical borderline case of the harmonic oscillator. Classically interpreted, this results in a sharply defined oscillation. In quantum mechanics this becomes the wave equation (see right). The wave oscillates beyond the boundary, which can no longer correspond to quantum reality with the TO. Combined, the above dampings lead to resonance in the one room of the TO, which is actually a sandwich of rooms - see SEPARATION OF ROOMS. This explains the spooky distant effect of the photon. Since the oscillation propagates gravitatively with c, the oscillation is at the latest there, where the photon is, which reminds of the fairy tale "The rabbit and the hedgehog". Quantum entanglement turns out to be a resonance event, and the ERP paradox is no longer one! last modification 24.02.2019
<urn:uuid:6e53efd8-3508-4934-b136-9453360c50fd>
CC-MAIN-2021-10
http://wolfgang-kleff.de/spooky-photon.html
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178381230.99/warc/CC-MAIN-20210307231028-20210308021028-00392.warc.gz
en
0.93151
1,518
3.75
4
First, let’s start by analyzing the concept and components of classical computing. Classical computers obey the principles of classical physics. A classical computer performs operations using information stored in the form of bits, whose value is either zero (0) or one (1). Now, when we program a classical computer, we have a CPU which has an input, an output, and software which regulates the CPU. This is called a Turing machine, which also happens to be the substructure of your cell phone’s and laptop’s computing power. In spite of its relative simplicity, a Turing machine may be constructed to simulate any given computer algorithm’s logic. Unfortunately, even as classical computers have become faster and more compact, they are unable to solve problems like factoring massive integers efficiently. In quantum computing, instead of having information stored in the form of bits, we have a new unit called a qubit or quantum bit, which carries quantum information. In a classical system a bit can only be in two positions, either up or down (commonly represented as a zero or one). In quantum computing, the qubit can be in any superposition of both at the same time. Qubits can be in the given states |0⟩ and |1⟩ (note: 0 and 1 are not always the given values for a qubit; various others may be used, but with the same result) as well as any combination of the two, which will yield another valid quantum state x|0⟩ + y|1⟩, where the two variables x and y represent complex numbers. With this basic knowledge, we can analyze the processor inside a quantum computer; specifically, the D-Wave quantum computer. The Elementary Units of Quantum Computing In the introduction, we covered how we can represent qubits symbolically as a 0 or 1, as well as a superposition of both of the states. We will now cover how qubits are constructed as well as their appearance. In conventional computing, we use CMOS transistors to encode bits of information. This is done by regulating the voltage to transistors that are fitted with a bus to determine whether the state is a 0 or 1. Quantum transistors are somewhat similar, yet vastly different from our current CMOS transistors. Interference here refers to the actual electrons, and how they act as waves that create interference patterns and cause quantum effects to occur. This is the basis of quantum computing (essentially a quantum transistor). The electron behaves as a qubit due to the nature of the material called niobium, which is what the qubit loop is made of. When the niobium is cooled to reach its critical temperature, it manifests the qualities of quantum mechanics. Our classical transistors encode their two states by regulating voltages. The SQUID (superconducting quantum interference device) encodes its two states in magnetic fields, which are designated down or up. The two states are given as -1 and +1, and the qubit can be in a superposition of both. This is done by combining the Josephson effect (the phenomenon of supercurrent) and the quantization of flux. BCS pairs of electrons tunnel through a weak link (which in this case is a weak insulating barrier) between the niobium electrodes. For any current below a given critical value, a supercurrent is established between the two superconductors and yields no voltage across the Josephson junction. Any time a current is larger than the critical value, a voltage can be read across the junction. The qubits need to be linked together in a fashion that is capable of relaying information.
The qubits are attached together by couplers, which are also made from superconducting material. When we combine the qubits and couplers, we are capable of creating a programmable quantum-mechanical structure. The superconducting qubits are formed into rectangles, with couplers placed at the points where qubits meet. These couplers, in a sense, couple the data or variables in an equation, making it more efficient to solve. Unfortunately, there are more components needed to create a functional quantum processor. Much of the structure and circuitry that surrounds the qubits is composed of switches that function by the Josephson effect. This circuitry directs the information from the qubits into various memory components which store the data in a magnetized medium. Each of the qubits is equipped with a read-out apparatus. The read-out takes the vector of the coherent superposition state and projects it into a pure zero or one state, losing the phase information. The probability of projection into the zero or one state is estimated by repeating the procedure many times and averaging the result. These apparatuses are kept inoperative while calculations are being made, to prevent the qubits’ quantum behavior from being disturbed. Once calculations have been completed, and after each qubit has settled into a classical state, the recorded data is converted into a chain of classical bits which can then be read by the user. The structure of the processor is different from the typical silicon processor in that each qubit has individual memory devices instead of large cache areas. Quantum processors have been speculated to offer computing power many orders of magnitude beyond our conventional computers. If we take a coherent-state qubit system with X qubits, then we can superpose 2^X different sequences of bits (remember that each additional qubit will yield twice as many values, which is where the 2^X comes from). Now, to equate that to conventional computers, we take the difference in energy levels of the qubit, which in this case happens to be in the gigahertz region, giving us 2^X gigahertz. This means that with 20 qubits a quantum processor could process approximately 2^20 operations per second. We can conclude that quantum processors have substantially greater potential than conventional computers. Recently the D-Wave 2X system was manufactured and is considered to be the most powerful quantum computer to date. It operates at 0.015 kelvin above absolute zero, and its processor generates no heat. The system is comprised of over 1,000 qubits that operate near absolute zero to generate a massive amount of quantum effects. To put this into perspective, the system can search through 2^1000 solutions at once, which is more than the number of particles in the observable universe. The D-Wave 2X has a rumored list price north of $15,000,000 and has been released for general availability.
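To see how quickly the 2^X count in the paragraph above grows, here is a small, purely illustrative calculation. The 16-bytes-per-amplitude figure is an assumption about a classical simulator, not a property of the D-Wave hardware:

```python
def basis_states(n_qubits):
    # Each additional qubit doubles the number of bit sequences that can be superposed.
    return 2 ** n_qubits

def classical_amplitude_tib(n_qubits, bytes_per_amplitude=16):
    # Classical memory needed to hold one complex amplitude per basis state.
    return basis_states(n_qubits) * bytes_per_amplitude / 2**40

print(f"{basis_states(20):,}")                    # 1,048,576 states for 20 qubits
print(f"{classical_amplitude_tib(50):,.0f} TiB")  # ~16,384 TiB just for 50 qubits
print(len(str(basis_states(1000))))               # 2^1000 is a 302-digit number
```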
<urn:uuid:41cb13a7-5271-467f-9f3c-9abce8e13b80>
CC-MAIN-2021-10
https://www.allpcb.com/sns/the-basic-analysis-of-a-quantum-processo_2775.html
s3://commoncrawl/crawl-data/CC-MAIN-2021-10/segments/1614178360107.7/warc/CC-MAIN-20210228024418-20210228054418-00435.warc.gz
en
0.940321
1,323
4
4
Can quantum technology improve the performance of batteries? The answer is yes. A project led by researchers at the University of Sussex is using quantum sensors to measure battery behavior, with the expectation that the resulting data can be used to improve battery technology. The project has been awarded with the University of Birmingham’s Partnership Resource Funding, UK Quantum Technology Hub Sensors and Timing. The project team also includes the Universities of Strathclyde and Edinburgh as part of the consortium. The project addresses a crucial need to increase energy density, durability and safety in batteries, thus driving the industrial revolution towards an increasingly green ecosystem. To achieve these and other green goals, intensive research and development in these areas are needed while implementing environmental policies. In an interview with EE Times, Peter Kruger, research professor of experimental physics at the University of Sussex, highlighted how batteries seem to be the first big market for quantum battery sensors, as EVs require large battery packs with high storage capacity. “That would mean the first significant commercial impact of quantum sensors,” said Kruger. Battery and quantum technology New electric vehicle control systems, including regenerative braking systems, start & stop functionality, and the electric motors that drive the wheels, all require accurate measurement and control of electrical inputs to optimize performance and avoid catastrophic failure. An essential part of these systems is the battery current measurement sensor, which measures the battery charge and discharge level and its state of health. There are several existing technologies to create a good current sensor for vehicle battery monitoring. At the same time, simulating the chemical structure of batteries using quantum computing makes it possible to apply these algorithms to reproduce the chemical composition inside a battery according to various criteria, such as weight reduction, maximum density, and cell assembly. This speeds up the industrialization of the battery pack itself. The University of Sussex project The goal of the project is to implement quantum magnetometer technology to examine if microscopic battery current flows accurately. In this way, rapid assessments of the chemistries of new and existing batteries will accelerate the creation of superior battery technology, thereby facilitating electrification. A magnetometer is an instrument with a single sensor that measures magnetic flux density. Quantum magnetometers are based on the spin of subatomic particles. The coupling of each particle magnetic moment with the applied field is quantized or limited to a discrete set of values as determined by the laws of quantum mechanics. Kruger pointed out that there have been many cases of lithium battery failures in recent years that have made the headlines, such as the case of Samsung’s Galaxy Note 7 smartphone. Monitoring the current flow could allow preventive actions to be taken before these battery failures occur. A quantum sensor could provide batteries with a some sort of intelligence by monitoring their health and reducing the most worn cells load. “Current battery monitoring solutions mainly focus on measuring the voltage across batteries. This gives a good indication of the charge left inside a battery, and by measuring these voltages during many subsequent charge/discharge cycles, the charge capacity can be monitored as the battery degrades,” said Kruger. 
He added, “While these measurements are useful to monitor the battery state of health, they do not tell us much about what is going on inside the battery. The quantum systems in development allow the magnetic fields generated by the battery to be measured, which are used to deduce the electrical currents that flow through the battery. This system acts as a ‘magnetic camera’, able to peer inside the battery.”

The research group’s aim is to develop small, low-power, portable devices that require no infrastructure and minimal running costs, thus being suitable for economical production. The academics will also work closely with CDO2, Magnetic Shields Ltd and QinetiQ to achieve their goal. Magnetic Shields Ltd will provide the magnetic noise-free environment required to allow the sensor technology to be tested with unprecedented sensitivity.

“A large application is in the research sector, where battery manufacturers can bench-test different chemistries and cell geometries. The sensors could send diagnostic information to the on-board computer of an EV and could reduce the service interval as manual check-ups are no longer required. Battery farms are being developed as a form of renewable energy storage, and the technology can be adapted to be used as a smart-charging system, as well as monitoring the battery state of health,” said Kruger.

The big challenge at the moment is raising the capacity of the batteries. “Technology-wise the sensors are not just sensitive to magnetic fields from the battery, but from all ferromagnetic substances. Much of the work we carry out is in the design of the sensors, and looking at how we can shield them from external magnetic sources. We have to think about how the system will be able to filter out the magnetic fields generated by the car’s electric motor, or quick changes in magnetic fields as around a ton of metal passes the sensor each time a car passes in the other direction. A full supply chain for all relevant components needs to be established. We’re well underway doing that through concurrent Industrial Strategy funding,” said Kruger.

Batteries are key to decarbonization, but improvements are needed in both the chemistry and the supporting technology. Lithium-ion batteries are still the gold-standard technology in this field, and have come a long way. Checking each battery is an operation that has to take into account many factors, such as leaks and imperfections, which adversely affect the performance of the entire system, whether it is an electric vehicle or a simple consumer device.

This article was originally published at sister publication EE Times.
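To make the “magnetic camera” idea concrete, here is a minimal sketch of how a current can be deduced from a measured magnetic field. It assumes the simplest possible geometry (a long straight conductor and the textbook Biot–Savart relation) and an illustrative 1 pT sensor resolution; a real cell has a far more complex current distribution, and this is not the project’s actual reconstruction method.

```python
import numpy as np

MU_0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def field_from_wire(current_a, distance_m):
    """Magnetic flux density around a long straight conductor."""
    return MU_0 * current_a / (2 * np.pi * distance_m)

def current_from_field(field_t, distance_m):
    """Invert the same relation: deduce the current from a measured field."""
    return 2 * np.pi * distance_m * field_t / MU_0

# A 10 A discharge current measured 5 mm from the conductor
b = field_from_wire(10.0, 0.005)             # ~4e-4 T
print(f"expected field: {b * 1e6:.1f} uT")   # 400.0 uT

# A magnetometer resolving ~1 pT could, in this toy geometry,
# distinguish current changes of roughly:
delta_i = current_from_field(1e-12, 0.005)
print(f"resolvable current change: {delta_i * 1e6:.3f} uA")  # 0.025 uA
```

Real battery monitoring has to invert many such measurements over the whole cell surface, which is exactly the imaging problem the quoted “magnetic camera” description refers to.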
Data storage is the containment of any type of information in a particular location. Though today it is typically used to describe storing applications, files and other computing resources, it has existed as long as humans have. Data has been commonly stored and managed by memorizing, carving, writing, recording sound and video, printing type, taping, programming, creating files and powering servers.

It is estimated that the world will create 44 zettabytes of data in 2020; that’s 687 billion times larger than the data contained in all the scrolls in the Great Library of Alexandria, the largest library of the ancient world. And that number grows exponentially every year. Storing, managing and securing all that data requires enormous computing power and physical storage devices such as hard drives, flash memory, solid state drives and data tapes, whether on laptops, mobile devices or on servers in a cloud or data center. It also makes issues such as data storage integrity, reliability, and compatibility extremely important; nothing less than preserving the record of our civilization is at stake.

Historical data storage

Ancient data storage was both thorough and intricate. Ancient tribes memorized lengthy pieces of history and literature and handed them down through generations by regular recitation and practice. The Bible records data about the 12 tribes of Israel, including head counts of certain tribe members. Thousands of years later, that data is preserved. Ancient people carved drawings, writing and numerical values on cave walls, stone tablets and pieces of clay, many of which still exist. The abacus and other calculation methods managed numerical data.

The Antikythera mechanism was an advanced time-tracking tool that used computing processes, dials, and gears to track astronomical movement and calendar dates. It was found in 1900 on a sunken ship near a Greek island. It is known as the first analog computer. The Antikythera mechanism produced data about the stars and the calendar years in advance. The advanced design of this computing tool suggested that it was not the first one to be designed.

Medieval data storage is less notable (the years 500-1300 AD were not dubbed the Dark Ages by accident), perhaps partly because ancient invention sank into oblivion for many years and historical records from medieval times are fuzzier. (After the aforementioned Antikythera mechanism, similar machines do not appear to have been invented for a good 1,300 years.) However, the popularity of writing on parchment and the development of books marked an important step in storing data. During this period, as monks and scribes painstakingly created books filled with color and design, data storage became a work of art as well as a method of recording information.

In the 15th century, Gutenberg invented the printing press. Typesetting allowed people to make information much more available much more quickly. Though books were considered the property of the extremely wealthy or at least well-to-do for centuries more, they put physical copies of data into many more people’s hands. This not only increased the development of learning but also provided all people with the opportunity to analyze governmental and philosophical processes for themselves and challenge injustice. During the industrial age, multiple inventors created machines that performed calculations and stored information; notably, Charles Babbage, in the 19th century, designed an early computer.
The term business intelligence came into use in the mid-19th century, arising from carefully compiled statistics and referring to the storage and analysis of data. Computing machines became very important in the world wars, in which they assisted in breaking codes, planning attacks and dropping bombs.

A side note regarding the most advanced kind of data storage: though it may seem an obvious point, the brain is much more advanced than any computer or network in its ability to process and use data (artificial intelligence is one of the more advanced forms of technology, and even it can only hope to catch up with the brain). The human mind can store data through memorization (as mentioned earlier) and through naturally intaking information. The brain manages the inner workings of many different systems through electric signals and stores data through its natural processing and advanced analytics system.

Pre-digital data, file and image storage

Before data storage providers went digital, there were a few providers who specialized in safeguarding data and files on paper, on film, in images, on objects and in other formats. Most of these companies are still in business, because not everything that needs to be stored and protected is in a computer system. Companies such as Iron Mountain (started up in 1951) and competitors Access Information Management, Hewlett Packard Enterprise, H3C and CoreSite Realty are known to build and maintain super-secure storage facilities–both above and below ground–in order to safeguard valuable public and private information and artifacts. These storage providers still play an important role in real-world use cases. For example, Iron Mountain protects a high percentage of Hollywood movie history–thousands of cans of physical film dating back to the late 19th century–in an underground vault in the West Los Angeles area. Iron Mountain and others also store a great deal of information and artifacts for the federal, state and local governments.

Computer data storage

In a modern computer, a central processing unit (CPU) is the control center for the computer, giving commands that the computer then executes. It is connected to primary storage, or main memory. Random access memory (RAM), part of main memory, holds the data that the CPU is actively working on, but it cannot hold much at once. Secondary storage, however, stores data in the background, where it can be accessed by computer memory and brought into primary storage, or RAM, for processing.

Multiple types of hardware are available for storing and processing data. Hard disks store more data than soft disks and can deliver information more quickly. Soft (floppy) disks, though easier to transport and purchase, are far less reliable. Direct-attached storage refers to data storage that’s attached to a computer or server rather than accessed over a network. This makes it readily available, which is beneficial if a network is down and a user needs to access data. Solid state drives (SSDs) are just one example of direct-attached storage: external hard drives, which can be an SSD or hard disk drive, plug into a computer, allowing users to instantly access the data stored within the drive.

Software-defined storage (SDS) decouples storage management from the underlying hardware, such as the servers in a data center, and can be administered remotely. SDS can control multiple environments and allows flexible data storage (on servers, pieces of hardware, virtual machines, etc).
It’s more abstract than on-premises storage, but it also provides many more compute resources and greater flexibility. Data centers were initially developed in the mid-1900s (perhaps initially modeled after ENIAC, one of the first computers), but their usage grew much more quickly in the late 1990s. As demand for computing skyrocketed, huge infrastructures were built to meet the need. Now data centers exist physically and virtually. Google has 11 physical centers in the United States alone and 19 globally (as of 2020). Data centers require enormous amounts of management, cooling and security monitoring; they must also be placed in locations with minimal natural disaster tendencies.

Modern data storage concerns

Though the flexibility and agility of data storage has improved through software-defined and hybrid cloud environments, this doesn’t solve the problem of obsolete storage methods. Throughout history, storage methods have increasingly become less sturdy, if also easier to use. Storing data through technology is still relatively abstract compared to previous methods of storage—such as rock carving, which could only be lost if physically misplaced or damaged over hundreds of years of weather. In contrast, technology becomes obsolete quickly (more so than paper, which came before it), and users run the risk of losing their information if they can’t find a new location to properly store and process it.

Different formats and generations of technology make old files obsolete quickly, and occasionally unreadable, necessitating the migration of data from one generation of technology to the next. Videos are one example: they’re challenging to transfer between mediums, and the technology that reads them (VHS and DVD drives, for example) can fail and storage devices deteriorate over time. Even with the significant expansion of the cloud, computing processes still have to run on servers, and if technology shifts further, users may struggle to save all of their important data. Error rates during storage and transmission are also a threat to the integrity of data: if enough bits flip from 0 to 1 or vice versa, a file may become unreadable. While quantum computing is an attempt to move beyond the limits of modern data storage and computing, at its most basic level, data storage remains a digital process, defined by just two binary values, 1 and 0.

One of the most common types of enterprise data storage–RAID, or redundant array of independent disks–is an attempt to limit the risk of disk failures by spreading out data and duplicating it. Backup is an essential data protection strategy and can even help fight security threats, such as ransomware. The more data people store, the more information they risk losing, and the more they need strategies to protect and preserve it. The importance of backing up data has increased as users rely further on technology. Accessing data in the cloud (using Google Drive to create documents, for example) is one helpful method, but it’s also important to save files on an external device such as a hard drive. The most important files should ideally be kept physically outside a computer network (in print form). You could also attempt carving them into a rock, depending on the relative importance of the file.
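To make the RAID idea concrete, here is a minimal sketch of the XOR parity trick used by RAID-4/5-style arrays: one extra parity block lets any single lost data block be rebuilt from the survivors. The three data blocks and their contents are invented for illustration.

```python
def xor_blocks(*blocks: bytes) -> bytes:
    """Bytewise XOR of equal-length blocks."""
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            out[i] ^= b
    return bytes(out)

# Three data blocks striped across three disks, plus one parity disk
d0, d1, d2 = b"quantum ", b"storage ", b"example "
parity = xor_blocks(d0, d1, d2)

# Disk 1 fails: rebuild its contents from the survivors and the parity block
rebuilt = xor_blocks(d0, d2, parity)
assert rebuilt == d1
print(rebuilt)  # b'storage '
```

The same XOR relationship also explains why RAID cannot survive two simultaneous disk failures in this simple configuration: with two blocks missing, the equation no longer has a unique solution.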
'Molecular spintronics': New technology offers hope for quantum computing

Quantum computers, which work according to the strange rules of quantum mechanics, may one day revolutionize the world. Once we have managed to build a powerful working machine, it will be able to solve some problems that take today's computers millions of years to compute. Computers use bits (zero or one) to encode information. Quantum computers use "qubits", which can exist in superpositions of zero and one, giving them huge processing power. But quantum systems are notoriously fragile, and although progress has been made to build working machines for some proposed applications, the task remains difficult. But a new approach, dubbed molecular spintronics, offers fresh hope.

In 1997, theoretical physicists Daniel Loss and David DiVincenzo laid down the general rules necessary for creating a quantum computer. While normal electronic devices use electric charge to represent information as zeros and ones, quantum computers often use electron "spin" states to represent qubits. Spin is a fundamental quantity we've learned about through quantum mechanics. Unfortunately, it lacks an accurate counterpart in everyday experience, even though an analogy of a planet spinning on its own axis is sometimes used. We do know that electrons spin in two different directions or "states" (dubbed up and down). According to quantum mechanics, each electron in a material spins in a combination (superposition) of these states—a certain bit up and a certain bit down. That's how you can get so many values rather than just zero or one.

Among the five requirements for building a quantum computer laid out by Loss and DiVincenzo was the possibility of scaling up the system. More qubits mean more power. Another was making information survive for reasonable amounts of time once encoded, while others concerned the initialization, manipulation and read-out of the physical system. Although originally conceived for a quantum computer based on electron spins in tiny particles of semiconductors, the proposal has now been implemented across many physical systems, including trapped ions, superconductors and diamonds. But, unfortunately, these require a near perfect vacuum, extremely low temperatures and no disturbances to operate. They are also hard to scale up.

Spintronics is a form of electronics based on spin rather than charge. Spin can be measured because it generates tiny magnetic fields. This technology, which often uses semiconductors for manipulating and measuring spin, has already had a huge impact on improving hard drive information storage. Now, scientists are realizing that spintronics can also be done in organic molecules containing rings of carbon atoms. And that connects it with a whole other research field called molecular electronics, which aims to build electronic devices from single molecules and films of molecules.

The combination has proven useful. By carefully controlling and manipulating an electron's spin within a molecule, it turns out we can actually do quantum computations. The preparation and readout of the electron's spin state on molecules is done by zapping them with electric or magnetic fields. Carbon-based organic molecules and polymer semiconductors also address the criterion of being easy to scale up. They do this through an ability to form molecular frameworks, within which molecular qubits sit in close proximity to each other.
The tiny size of a single molecule automatically favors packing large numbers of them together on a small chip. In addition, organic materials disturb quantum spins less than other electronic materials do. That's because they are composed of relatively light elements such as carbon and hydrogen, resulting in weaker interactions with the spinning electrons. This prevents the spins from easily flipping state, so they are preserved for long periods of up to several microseconds. In one propeller-shaped molecule, this duration can even be up to a millisecond. These relatively long times are sufficient for operations to be performed—another great advantage.

But we still have much left to learn. In addition to understanding what causes extended spin lifetimes on organic molecules, a grasp of how far these spins can travel within organic circuits is necessary for building efficient spin-based electronic circuits. We have been developing concepts for exploratory organic spintronic devices towards this goal.

There are also major challenges in getting such devices to work efficiently. The charged electrons that carry spins in an organic material constantly hop from molecule to molecule as they move. This hopping activity is unfortunately a source of electrical noise, making it difficult to electrically measure small spin current signatures using conventional architectures. That said, a relatively new technique known as spin pumping might prove suitable for generating spin currents with low noise in organic materials.

Another problem when trying to make organic molecules serious candidates within future quantum technologies is the ability to coherently control and measure spins on single molecules, or on a small number of molecules. This grand challenge is currently seeing tremendous progress. For example, a simple program for a quantum computer known as "Grover's search algorithm" was recently implemented on a single magnetic molecule. This algorithm is known to significantly reduce the time necessary to perform a search on an unsorted database. In another report, an ensemble of molecules was successfully integrated into a hybrid superconducting device. It provided a proof of concept in combining molecular spin qubits with existing quantum architectures.

Much is left to be done, but in the current state of play, molecular spin systems are fast finding several new applications in quantum technologies. With the advantage of small size and long-lived spins, it is only a matter of time before they cement their spot in the roadmap for quantum technologies.
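Since Grover's search algorithm comes up above, here is a minimal classical simulation of it on a toy "database" of eight items (three qubits), using plain state-vector arithmetic rather than any molecular hardware; the marked index and the iteration count are illustrative choices.

```python
import numpy as np

n_items = 8                      # "database" of size N = 2**3 (three qubits)
marked = 5                       # index we are searching for

# Start in the uniform superposition over all N basis states
state = np.full(n_items, 1 / np.sqrt(n_items))

# Oracle: flips the sign of the marked item's amplitude
oracle = np.eye(n_items)
oracle[marked, marked] = -1

# Diffusion operator: inversion about the mean amplitude
diffusion = 2 * np.full((n_items, n_items), 1 / n_items) - np.eye(n_items)

# Roughly (pi/4) * sqrt(N) iterations are optimal -- 2 for N = 8
for _ in range(2):
    state = diffusion @ (oracle @ state)

probabilities = state ** 2
print(np.argmax(probabilities))          # 5
print(round(probabilities[marked], 3))   # ~0.945
```

Two Grover iterations already concentrate about 95% of the probability on the marked item, whereas a worst-case classical scan of the unsorted list would need eight queries.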
Chasing clues about the infant universe in relic light known as the cosmic microwave background, or CMB, scientists are devising more elaborate and ultrasensitive detector arrays to measure the properties of this light with increasing precision. To meet the high demand for these detectors that will drive next-generation CMB experiments, and for similar detectors to serve other scientific needs, researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) are pushing to commercialize the manufacturing process so that these detectors can be mass-produced quickly and affordably. The type of detector they are working to commercialize incorporates sensors that, when chilled to far-below-freezing temperatures, operate at the very edge of superconductivity—a state in which there is zero electrical resistance. Incorporated in the detector design is transition-edge sensor (TES) technology that can be tailored for ultrahigh sensitivity to temperature changes, among other measurements. The team is also working to commercialize the production of ultraprecise magnetic field sensors known as SQUIDs (superconducting quantum interference devices). In the current TES detector design, each detector array is fabricated on a silicon wafer and contains about 1,000 detectors. Hundreds of thousands of these detectors will be needed for a massive next-generation CMB experiment, dubbed CMB-S4. The SQUID amplifiers are designed to enable low-noise readout of signals from the detectors. They are intended to be seated near the detectors to simplify the assembly process and the operation of the next-generation detector arrays. More exacting measurements of the CMB light’s properties, including specifics on its polarization—directionality in the light—can help scientists peer more deeply into the universe’s origins, which in turn can lead to more accurate models and a richer understanding of the modern universe. Berkeley Lab researchers have a long history of pioneering achievements in the in-house design and development of new detectors for particle physics, nuclear physics, and astrophysics experiments. And while the detectors can be built in-house, scientists also considered the fact that commercial firms have access to state-of-the-art, high-throughput microfabricating machines and expertise in larger-scale manufacturing processes. So Aritoki Suzuki, a staff scientist in Berkeley Lab’s Physics Division, for the past several years has been working to transfer highly specialized detector fabrication techniques needed for new physics experiments to industry. The goal is to determine if it’s possible to produce a high volume of detector wafers more quickly, and at lower cost, than is possible at research labs. “What we are building here is a general technique to make superconducting devices at a company to benefit areas like astrophysics, the search for dark matter, quantum computing, quantum information science, and superconducting circuits in general,” said Suzuki, who has been working on advanced detector R&D for about a decade. This breed of sensors has also been enlisted in the hunt for a theorized nuclear process called neutrinoless double-beta decay that could help solve a riddle about the abundance of matter over antimatter in the universe, and whether the ghostly neutrino particle is its own antiparticle. Progress toward commercial production of the specialized detectors has been promising. 
“We have demonstrated that detector performance from commercially fabricated detectors meets the requirements of typical CMB experiments,” Suzuki said. Work is underway to build the prototype detectors for a planned CMB experiment in Chile known as the Simons Observatory that may incorporate the commercially produced detectors.

About 3 miles above sea level, in the Atacama Desert of Northern Chile, researchers have worked on successive generations of TES-based detector arrays for CMB-related experiments including POLARBEAR, POLARBEAR-2, the Simons Array, and the Simons Observatory. A detector array for two telescopes that are part of the POLARBEAR-2 and Simons Array experiments is now being fabricated at UC Berkeley’s Marvell Nanofabrication Laboratory by Berkeley Lab and UC Berkeley researchers. The effort will ultimately produce 7,600 detectors apiece for three telescopes. The first telescope in the Simons Array has just begun its commissioning run. The Simons Observatory project, which is now in a design and prototyping phase, will require about 80,000 detectors, half of which will be fabricated at the Marvell Nanofabrication Laboratory.

These experiments are driving toward a CMB-S4 experiment that will combine detector arrays in Chile and near the South Pole to better resolve the cosmic microwave background and possibly help determine whether the universe underwent a brief period of incredible expansion known as inflation in its formative moments. The commercial fabrication effort is intended to benefit this CMB-S4 experiment, which will require a total of about 500,000 detectors. The current design calls for about 400 detector wafers that will each feature more than 1,000 detectors arranged on hexagonal silicon wafers measuring about six inches across. The wafers are designed to be tiled together in telescope arrays.

Suzuki, who is part of a scientific board working on CMB-S4 along with other Berkeley Lab scientists, is collaborating with Adrian Lee, another board member who is also a physicist at Berkeley Lab and a UC Berkeley physics professor. It was Lee who pioneered microfabrication techniques at UC Berkeley to help speed the production of TES-containing detectors. In addition to the detector production at UC Berkeley’s nanofabrication laboratory, researchers have also built specialized superconducting readout electronics in a nearly dustless clean room space within the Microsystems Laboratory at Berkeley Lab.

Before the introduction of higher-throughput manufacturing processes, detectors “were made one by one, by hand,” Suzuki noted. Suzuki labored to develop the latest 6-inch wafer design, which offers a production throughput advantage over the previously used 4-inch wafer designs. Older wafers had only about 100 detectors, which would have required the production of many more wafers to fully outfit a CMB-S4 experiment.

The current detector design incorporates niobium, a superconducting metal, and other uncommon metals like palladium and manganese-doped aluminum alloy. “These are very unique metals that normally companies don’t touch. We use them to achieve the unique properties that we desire for these detectors,” Suzuki said.

The effort has benefited from a Laboratory Directed Research and Development grant that Lee received in 2015 to explore commercial fabrication of the detectors. Also, the research team has received support from the federally supported Small Business Innovation Research program, and Suzuki has also received support from the DOE Early Career Research Program.
Suzuki has worked with Hypres Inc. of New York and STAR Cryoelectronics of Santa Fe, New Mexico, on the fabrication processes for the detectors, and worked with the University of New Mexico and STAR Cryoelectronics on the SQUID amplifiers. Suzuki said that working with the companies has been a productive process. “They gave us a lot of ideas,” he said, to help improve and streamline the processes. The industry-produced SQUID amplifiers will be used in one of the telescopes of the POLARBEAR-2/Simons Array experiment, Suzuki noted, and the design of these amplifiers could drive improvements in the readout electronics of a CMB-S4 experiment. As a next step in the effort to commercially fabricate detectors, a test run is planned this year to demonstrate fabrication quality and throughput.
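As a rough illustration of why the transition-edge sensors described above are so sensitive, the sketch below models the superconducting transition with a simple logistic resistance-versus-temperature curve. The transition temperature, width and normal-state resistance are invented round numbers, not parameters of the Berkeley Lab devices.

```python
import numpy as np

def tes_resistance(temp_k, r_normal=0.1, t_c=0.160, width_k=0.001):
    """Toy logistic model of a TES resistance-vs-temperature transition.

    r_normal : normal-state resistance (ohms)   -- illustrative value
    t_c      : transition temperature (K)       -- illustrative value
    width_k  : transition width (K)             -- illustrative value
    """
    return r_normal / (1 + np.exp(-(temp_k - t_c) / width_k))

# Bias the sensor mid-transition and see what a 100 microkelvin change does
t_bias = 0.160
dT = 100e-6
dR = tes_resistance(t_bias + dT) - tes_resistance(t_bias)
print(f"resistance change for a 100 uK excursion: {dR * 1e3:.3f} mOhm")

# Steepness of the transition (alpha = dlogR/dlogT) at the bias point
eps = 1e-7
slope = (tes_resistance(t_bias + eps) - tes_resistance(t_bias - eps)) / (2 * eps)
alpha = slope * t_bias / tes_resistance(t_bias)
print(f"alpha at bias point: {alpha:.0f}")
```

The steep slope at the bias point is what turns a tiny temperature change from absorbed CMB light into a comparatively large, easily read-out change in resistance.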
These days, losing the manual for some piece of electronics you’ve purchased is notable mostly because you had a printed document to lose in the first place. In the dead-tree dominated days of yore, of course, this was less true. Documentation loss is a major problem in the effort to understand old computer systems, and it’s part of what drives ongoing data preservation efforts across the industry. Until recently, the Zuse Z4 could have been a poster child for this sort of problem.

The Z4 was the brainchild of Konrad Zuse, a German who deserves to be better known than he is for his early, groundbreaking work. Zuse had the misfortune to be making some of his biggest breakthroughs immediately prior to and during World War II. It was Zuse who designed the first high-level programming language from 1942 to 1945. This is remarkable because, as Wikipedia notes, Zuse had no training whatsoever in mechanical computing devices. He independently discovered both propositional calculus and lattice theory, calling them “combinatorics of conditionals” and “study of intervals,” respectively.

The Zuse Z4 is the oldest preserved digital computer in the world and arguably* the first digital computer. The Z4 was developed through the end of the war and was moved multiple times while under construction to keep it away from the advancing Soviet army. After the war, it was expanded and became the second digital computer in the world to be sold. The preserved model is on display at the Deutsches Museum in Munich. Its documentation, however, was a different story.

A recent blog post by the Association for Computing Machinery details how the rare documents were found. Archivist Evelyn Boesch, with ETH Zurich, contacted Herbert Bruder of the ACM and informed him that her father, René Boesch, had kept a tranche of rare historical documents. These turned out to include a user manual for the Zuse Z4, as well as notes on flutter calculations. Other documents, dated October 27, 1953, detail what the Z4 was working on. At the time, it was being used to perform flutter calculations on the Swiss FFA P-16 fighter aircraft, which was then in development. Details from the recovered documents show that it took the Z4 50 hours to simulate 2.4 seconds of flight time, which is slightly worse than the current version of Microsoft Flight Simulator. The ACM blog post notes that “around 100 jobs were carried out with the Z4 between 1950 and 1955,” implying an average of about three weeks per job.

What We Learn From Manuals Like This

The recovered Z4 manual illustrates why this type of document preservation is so important. From their earliest days, computers were upgradeable — machines like ENIAC were outfitted with the equivalent of RAM upgrades and CPU improvements. In the Z4’s case, support for conditional jump instructions was added post-manufacture. The only problem was, nobody could remember exactly how the feature worked. ACM notes: “However, in a survey a few years ago, the few surviving eyewitnesses could not remember how it was executed.” Page 8 of the manual provides this information. My German is rusty, my technical German is nonexistent, and frankly, the images are a bit tough to read, so I’m not going to try to translate exactly how the function worked. Without information like this, it would be impossible to precisely replicate or understand how the Z4 embodied or improved upon the computational capabilities of the time.
*The answer to “Who invented the first computer?” is essentially arbitrary and depends entirely on how you choose to define the term “computer.” The UK’s Colossus is declared by Wikipedia to be the world’s first “programmable, electronic, digital computer,” but it was programmed by switches and plugs, not a stored program. The Z4 is considered to be the first commercial digital computer, but it’s not electronic. The first electronic stored-program computer is the Manchester Baby, but Konrad Zuse’s earlier Z3 could store programs on tape — it just wasn’t electronic. Other obscure machines, like the Atanasoff-Berry Computer, were not Turing-complete and couldn’t store programs, but still contributed critical ideas to the development of computing. Also, if you were taught that ENIAC was the first computer (or digital computer, or electronic digital computer, etc., ad nauseam), that’s more propaganda than fact. ENIAC was more directly based on machines like Colossus than was known at the time, because the wartime efforts of the British remained classified, while ENIAC was widely celebrated in the media.

Finally, reading up on the history of early computing is a good reminder of how many people, institutions, and companies contributed various technologies and principles to the field. One reason you can subdivide the question of “Who built the first computer” to such a fine degree is that there were so many “firsts” for someone to achieve. There was a time in the 1930s and 1940s when mechanical, electromechanical, and digital systems were sharing space and serious development dollars simultaneously. We don’t have anything remotely equivalent today, and even our wildest architectural departures from the x86 “norm” are still based on digital computing. That could change in the future, if Intel’s MESO architecture comes to fruition and proves capable of replacing CMOS in the long term. But for now, the 1930s and 1940s represent a tremendously dynamic period in computing history that we don’t really have an equivalent for — though some of the quantum computing work is getting really interesting.
In an earlier article, we talked about quantum entanglement. While doing so we touched on “action at a distance”. In this article we are going to use it as an introduction to the Principle of Physical Interactivity. Let us summarize action at a distance with a quote from that article:

Suppose you have two particles, particle A and particle B. Suppose these two particles can interact in some way such that if particle A does something, it will cause particle B to change state. Perhaps if particle A emits a smaller particle that strikes B, particle B will spin in a different direction. We will call that change in direction “event C”. If particle A and particle B are to interact to cause event C, then some kind of physical action must occur. The particles must act upon each other in some way which then causes event C. Particle A will have to emit some particle, vibrate some physical connection between particle A and particle B, or somehow effect some kind of physical interaction. An interaction is some kind of action taken by A which affects B. Otherwise, how else could particle A cause particle B to change the direction of its spin? By non-physical means? Using an abstraction? I think not…

Let us try to put this more simply. Let us say that we have a computer and a wireless keyboard connected to the computer by a wireless connection. We will use these two macroscopic objects as our example. However, it is trivial to extend these examples to subatomic particles and apply some simple logic to them, even though modern physics insists the subatomic world is not rational, or that it is not subject to the laws of logic.

Now, suppose we press the button “A” on the keyboard. As a result, the letter “A” now appears on the screen. In other words, our keyboard has interacted with the computer. Now, what has happened here? Is this witchcraft? Should we expect the Spanish Inquisition?

I am going to explain this by laying down a simple principle, which I am going to call the Principle of Physical Interactivity. The Principle of Physical Interactivity says that in physics, all objects that interact with one another do so by physically interacting with one another. All interactions in physics are the interactions of physical objects with each other.

What does this mean? What do I mean by physical? I mean that which has shape or “physical extension”. It is not an abstraction, not an attribute, not a relationship, not an action. It is a non-abstract entity. So, when I say that all objects interact via physical means, I mean that this interaction takes place when two physical objects act upon one another. The interaction is not by means of abstractions. It is via the actions of physical entities.

Take the computer and the keyboard. Are they physically interacting? Yes. There are physical objects of some kind traveling from the keyboard to the computer, or some other kind of physical activity in the keyboard which causes another physical action to take place in the computer. That interaction might be described by saying that there are waves traveling from the keyboard to the computer. The waves are abstract descriptions of some kind of motion/relationship. In that case, the keyboard interacts with the computer by some physical process involving the keyboard interacting with some kind of physical medium. The point of the Principle of Physical Interactivity is that some kind of physical interaction is required.

Must Objects Touch?

What does it mean for objects to touch?
I take it that they must have direct physical contact. Is this necessary? No. The Principle of Physical Interactivity merely says that there must be physical interaction. It does not require that two objects are in direct physical contact.

Let us return to the example of our computer and wireless keyboard. For the wireless keyboard to send a signal to the computer, must the keyboard and the computer be touching? Must they be in direct contact via some part of each other? The Principle of Physical Interactivity does not require this. It merely requires some kind of physical interaction. It does not require that the computer and the keyboard directly touch one another.

Other Forms of Contact?

Must there be some kind of invisible thread directly connecting the keyboard and the computer? The Principle of Physical Interactivity does not require this either. Physical interaction need not take place via objects such as a thread that directly connects the two objects.

How then can they interact? Well, the keyboard might send waves through a medium such as air, which the computer picks up. Hold on now, I thought waves were abstractions? Additionally, I thought you said that the computer and the keyboard must interact by physical means? The wave is an abstract description of objects taking some kind of action, of causing something to move through the air in a wave pattern and to hit the computer. Therefore, the keyboard and the computer do interact by physical means.

(Note that here on this site, we define waves as a kind of abstraction that describes motion or some other kind of relationship. Thus when we say “there is a water wave”, what we are talking about is an abstract description of a bunch of water molecules arranged in that shape. The referents of the concept of wave are the water molecules; the concept of “wave” describes the fact that they are related in that pattern.)

This is still a kind of physical interaction between the computer and the keyboard that causes the letter “A” to appear on the screen. To sum up, physical interaction is always required, but it may take place via direct touch or via some other physical medium.
In a major step forward for an area of research that earned the 2016 Nobel Prize in Physics, an international team has found that substances with exotic electronic behaviors called topological materials are in fact quite common, and include everyday elements such as arsenic and gold. The team created an online catalog to make it easy to design new topological materials using elements from the periodic table.

These materials have unexpected and strange properties that have shifted scientists' understanding of how electrons behave. Researchers hope these substances could form the basis of technologies of the future, such as low-power devices and quantum computing. "Once the analysis was done and all the errors corrected, the result was astonishing: more than a quarter of all materials exhibit some sort of topology," said B. Andrei Bernevig, a senior author on the paper and professor of physics at Princeton. "Topology is ubiquitous in materials, not esoteric."

Topological materials are intriguing because their surfaces can conduct electricity without resistance, so they are potentially faster and more energy-efficient than today's technologies. Their name comes from an underlying theory that draws on topology, a branch of mathematics that describes objects by their ability to be stretched or bent. The beginnings of the theoretical understanding of these states of matter formed the basis of the 2016 Nobel Prize in Physics, shared among Princeton University professor F. Duncan Haldane, the Sherman Fairchild University Professor of Physics; J. Michael Kosterlitz of Brown University; and David J. Thouless of the University of Washington, Seattle.

Until now, only a few hundred of the more than 200,000 known inorganic crystalline materials had been characterized as topological, and they were thought to be anomalies. "When fully completed, this catalog will usher in a new era of topological material design," Bernevig said. "This is the beginning of a new type of periodic table where compounds and elements are indexed by their topological properties rather than by more traditional means."

The international team included researchers from Princeton; the Donostia International Physics Center in San Sebastian, Spain; the IKERBASQUE Basque Foundation for Science; the University of the Basque Country; École Normale Supérieure Paris and the French National Center for Scientific Research; and the Max Planck Institute for Chemical Physics of Solids. The team investigated about 25,000 inorganic materials whose atomic structures are experimentally known with precision, and classified in the Inorganic Crystal Structure Database. The results show that rather than being rare, more than 27 percent of materials in nature are topological.

The researchers made the newly created online database available at www.topologicalquantumchemistry.com. It allows visitors to select elements from the periodic table to create compounds that the user can then explore for their topological properties. More materials are currently being analyzed and placed in a database for future publication.

Two factors allowed the complex task of topologically classifying the 25,000 compounds. First, two years ago, some of the present authors developed a theory, known as topological quantum chemistry and published in Nature in 2017, which allowed for the classification of the topological properties of any material from the simple knowledge of the positions and nature of its atoms.
Second, in the current study, the team applied this theory to the compounds in the Inorganic Crystal Structure Database. In doing so, the authors needed to devise, write and modify a large number of computerized instructions to calculate the energies of electrons in the materials. “We had to go into these old programs and add new modules that would compute the required electronic properties,” said Zhijun Wang, who was a postdoctoral research associate at Princeton and is now a professor at the Beijing National Laboratory for Condensed Matter Physics and the Institute of Physics, Chinese Academy of Sciences.

“We then needed to analyze these results and compute their topological properties based on our newly developed topological quantum chemistry methodology,” said Luis Elcoro, a professor at the University of the Basque Country in Bilbao, Spain. The authors wrote several sets of codes that obtain and analyze the topology of electrons in real materials. The authors have made these codes available to the public through the Bilbao Crystallographic Server. With the help of the Max Planck Supercomputer Center in Garching, Germany, the researchers then ran their codes on the 25,000 compounds.

"Computationally, it was pretty incredibly intensive stuff," said Nicolas Regnault, a professor at École Normale Supérieure, Paris, and a researcher at the French National Center for Scientific Research. "Fortunately, the theory showed us that we need to compute only a fraction of the data that we needed previously. We need to look at what the electron 'does' only in part of the parameter space to obtain the topology of the system."

"Our understanding of materials got much richer because of this classification," said Maia Garcia Vergniory, a researcher at Donostia International Physics Center in San Sebastian, Spain. "It is really the last line of understanding of properties of materials."

Claudia Felser, a professor at the Max Planck Institute for Chemical Physics of Solids in Dresden, Germany, had earlier predicted that even gold is topological. "A lot of the material properties that we know — such as the color of gold — can be understood through topological reasoning," Felser said.

The team is now working to classify the topological nature of additional compounds in the database. The next steps involve identifying the compounds with the best versatility, conductivity and other properties, and experimentally verifying their topological nature. "One can then dream about a full topological periodic table," Bernevig said.

An article accompanying the database, "A complete catalogue of high-quality topological materials," by M. G. Vergniory, L. Elcoro, Claudia Felser, Nicolas Regnault, B. Andrei Bernevig and Zhijun Wang, was published online in the journal Nature on Feb. 28 (DOI: 10.1038/s41586-019-0954-4).

Luis Elcoro was supported by the Government of the Basque Country (project IT779-13), the Spanish Ministry of Economy and Competitiveness (MINECO), and the European Fund for Economic and Regional Development (FEDER; project MAT2015-66441-P). Maia G. Vergniory was supported by MINECO (project IS2016-75862-P). B. Andrei Bernevig and Zhijun Wang acknowledge support for the analytical work from the U.S. Department of Energy (DE-SC0016239), a Simons Investigator Award, the David and Lucile Packard Foundation, and the Eric and Wendy Schmidt Transformative Technology Fund.
The computational part of the Princeton work was performed under the National Science Foundation (NSF) Early-concept Grants for Exploratory Research (EAGER): DMR 1643312 NOA-AWD1004957, ONR-N00014-14-1-0330, ARO MURI W911NF-12-1-0461 and NSF-MRSECDMR-1420541.
A new machine learning tool can calculate the energy required to make—or break—a molecule with higher accuracy than conventional methods. While the tool can currently only handle simple molecules, it paves the way for future insights in quantum chemistry. “Using machine learning to solve the fundamental equations governing quantum chemistry has been an open problem for several years, and there’s a lot of excitement around it right now,” says co-creator Giuseppe Carleo, a research scientist at the Flatiron Institute’s Center for Computational Quantum Physics in New York City. A better understanding of the formation and destruction of molecules, he says, could reveal the inner workings of the chemical reactions vital to life. Carleo and collaborators Kenny Choo of the University of Zurich and Antonio Mezzacapo of the IBM Thomas J. Watson Research Center in Yorktown Heights, New York, present their work May 12 in Nature Communications. The team’s tool estimates the amount of energy needed to assemble or pull apart a molecule, such as water or ammonia. That calculation requires determining the molecule’s electronic structure, which consists of the collective behavior of the electrons that bind the molecule together. A molecule’s electronic structure is a tricky thing to calculate, requiring the determination of all the potential states the molecule’s electrons could be in, plus each state’s probability. Since electrons interact and become quantum mechanically entangled with one another, scientists can’t treat them individually. With more electrons, more entanglements crop up, and the problem gets exponentially harder. Exact solutions don’t exist for molecules more complex than the two electrons found in a pair of hydrogen atoms. Even approximations struggle with accuracy when they involve more than a few electrons. One of the challenges is that a molecule’s electronic structure includes states for an infinite number of orbitals going farther and farther from the atoms. Additionally, one electron is indistinguishable from another, and two electrons can’t occupy the same state. The latter rule is a consequence of exchange symmetry, which governs what happens when identical particles switch states. Mezzacapo and colleagues at IBM Quantum developed a method for constraining the number of orbitals considered and imposing exchange symmetry. This approach, based on methods developed for quantum computing applications, makes the problem more akin to scenarios where electrons are confined to preset locations, such as in a rigid lattice. The similarity to rigid lattices was the key to making the problem more manageable. Carleo previously trained neural networks to reconstruct the behavior of electrons confined to the sites of a lattice. By extending those methods, the researchers could estimate solutions to Mezzacapo’s compacted problems. The team’s neural network calculates the probability of each state. Using this probability, the researchers can estimate the energy of a given state. The lowest energy level, dubbed the equilibrium energy, is where the molecule is the most stable. The team’s innovations made calculating a basic molecule’s electronic structure simpler and faster. The researchers demonstrated the accuracy of their methods by estimating how much energy it would take to pull a real-world molecule apart, breaking its bonds. They ran calculations for dihydrogen (H2), lithium hydride (LiH), ammonia (NH3), water (H2O), diatomic carbon (C2) and dinitrogen (N2). 
For all the molecules, the team’s estimates proved highly accurate even in ranges where existing methods struggle. In the future, the researchers aim to tackle larger and more complex molecules by using more sophisticated neural networks. One goal is to handle chemicals like those found in the nitrogen cycle, in which biological processes build and break nitrogen-based molecules to make them usable for life. “We want this to be a tool that could be used by chemists to process these problems,” Carleo says.

Carleo, Choo and Mezzacapo aren’t alone in tapping machine learning to tackle problems in quantum chemistry. The researchers first presented their work on arXiv.org in September 2019. In that same month, a group in Germany and another at Google’s DeepMind in London each released research using machine learning to reconstruct the electronic structure of molecules. The other two groups use a similar approach to one another that doesn’t limit the number of orbitals considered. This inclusiveness, however, is more computationally taxing, a drawback that will only worsen with more complex molecules. With the same computational resources, the approach by Carleo, Choo and Mezzacapo yields higher accuracy, but the simplifications made to obtain this accuracy could introduce biases. “Overall, it’s a trade-off between bias and accuracy, and it’s unclear which of the two approaches has more potential for the future,” Carleo says. “Only time will tell us which of these approaches can be scaled up to the challenging open problems in chemistry.”

More information: Kenny Choo et al. Fermionic neural-network states for ab-initio electronic structure. Nature Communications (2020). DOI: 10.1038/s41467-020-15724-9

Image caption: The tetrahedral electronic distribution of a water molecule. The oxygen atom nucleus is at the center of the tetrahedron, and the hydrogen nuclei are in the center of the pink spheres. Credit: Simons Foundation
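The core idea described above (a parameterized trial wavefunction whose parameters are tuned until the estimated energy stops decreasing) can be illustrated without any neural network at all. The sketch below minimizes the energy of a made-up two-level Hamiltonian with a one-parameter trial state; the numbers are invented, and the single parameter stands in for the many weights of the actual neural-network ansatz.

```python
import numpy as np

# Toy two-state "electronic structure" Hamiltonian (made-up numbers)
H = np.array([[-1.0, 0.4],
              [ 0.4, -0.3]])

def ansatz(theta):
    """Normalized trial state; a one-parameter stand-in for a neural-network wavefunction."""
    return np.array([np.cos(theta), np.sin(theta)])

def energy(theta):
    psi = ansatz(theta)
    return psi @ H @ psi          # <psi|H|psi> for a real, normalized psi

# Crude gradient descent on the single variational parameter
theta, lr = 0.3, 0.1
for _ in range(200):
    grad = (energy(theta + 1e-5) - energy(theta - 1e-5)) / 2e-5
    theta -= lr * grad

print(round(energy(theta), 6))                 # variational estimate
print(round(np.linalg.eigvalsh(H)[0], 6))      # exact ground-state energy for comparison
```

The two printed numbers agree closely, which is the whole point of the variational approach: the lowest energy the ansatz can reach is an upper bound on, and here a good estimate of, the true equilibrium energy.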
During the past months we’ve been reporting several breakthroughs in the field of quantum computing, and now IBM seems ready to truly pave the way for quantum computers. Researchers announced they are now able to develop a superconducting qubit made from microfabricated silicon that maintains coherence long enough for practical computation. Whoa! That probably sounds like a lot to swallow, so let’s break it down.

Bits and Qubits

Information is measured in ‘bits’, and a bit may have two positions (described typically as 0 or 1). Quantum computers however don’t use these bits, and instead they use quantum bits, or ‘qubits’. But while a bit must be a 0 or a 1, a qubit can be both 0, 1, or a superposition of both. This difference might seem small and subtle, but in fact, it is absolutely humongous: a register of just a few hundred qubits spans a state space larger than the number of atoms in the observable Universe. Needless to say a computer running on qubits would be game changing, in pretty much the same way microprocessors were in their day.

But what makes quantum computing extremely difficult is a problem called ‘decoherence’. In the quantum world, things don’t happen as they do in the ‘real world’; when a qubit moves from the 0 state to the 1 state or into a superposition, it can decohere back to the 0 state due to interference from other parts of the computer. Generally speaking, decoherence is the loss of the definite phase relationships between the components of a superposition. So in order for quantum computers to be practical and scalable, the system would have to remain coherent for a long enough time to allow error-correction techniques to function properly.

“In 1999, coherence times were about 1 nanosecond,” said IBM scientist Matthias Steffen. “Last year, coherence times were achieved for as long as 1 to 4 microseconds. With these new techniques, we’ve achieved coherence times of 10 to 100 microseconds. We need to improve that by a factor of 10 to 100 before we’re at the threshold we want to be. But considering that in the past ten years we’ve increased coherence times by a factor of 10,000, I’m not scared.”

Two different approaches, one breakthrough

IBM announced they took two different approaches, both of which played a significant part in the breakthrough they revealed. The first one was to build a 3-D qubit made from superconducting, microfabricated silicon. The main advantage here is that the equipment and know-how necessary to create this technology already exists, nothing new has to be invented, thanks to developments made by Yale researchers (for which Steffen expressed a deep admiration). Using this approach, they managed to maintain coherence for 95 microseconds – “But you could round that to 100 for the piece if you want,” Steffen joked.

The second idea involved a traditional 2-D qubit, which IBM’s scientists used to build a “Controlled NOT gate” or CNOT gate, which is a building block of quantum computing. A CNOT gate connects two qubits in such a way that the second (target) qubit flips its state if the first (control) qubit is in the state 1. The CNOT gate was able to produce a coherence of 10 microseconds, which is long enough to show a 95% accuracy rate – a notable improvement from the 81% accuracy rate, the highest achieved until now. Of course, the technology is still years away from being actually on the shelves, but the developments are very impressive.
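As a quick numerical illustration of the CNOT behavior just described, the sketch below applies the standard 4x4 CNOT matrix to two-qubit state vectors. The basis ordering and the example states are my own choices for illustration, not anything specific to IBM's hardware.

```python
import numpy as np

# Two-qubit basis ordering: |00>, |01>, |10>, |11>  (first qubit = control)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Control in |1>, target in |0>  ->  target flips, giving |11>
state_10 = np.kron(ket1, ket0)
print(CNOT @ state_10)            # [0. 0. 0. 1.]

# Control in an equal superposition, target in |0>  ->  entangled Bell state
plus = (ket0 + ket1) / np.sqrt(2)
bell = CNOT @ np.kron(plus, ket0)
print(bell)                       # ~[0.707 0. 0. 0.707] = (|00> + |11>)/sqrt(2)
```

Feeding the control qubit a superposition, as in the second example, is exactly how a CNOT gate is used to create entangled pairs of qubits.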
From quantum to reality

Given the rapid progress that is being made in the field of quantum computing, one can only feel that a quantum computer is looking more and more like a real possibility. As error correction protocols become more accurate and coherence times grow longer, we are moving more and more towards accurate quantum computing – but you shouldn’t expect a quantum smartphone just yet.

“There’s a growing sense that a quantum computer can’t be a laptop or desktop,” said Steffen. “Quantum computers may well just be housed in a large building somewhere. It’s not going to be something that’s very portable. In terms of application, I don’t think that’s a huge detriment because they’ll be able to solve problems so much faster than traditional computers.”

The next steps are simple, in principle, but extremely hard to do in practice. The accuracy rate has to be at least 99.99%, up to the point where it achieves what is called a ‘logical qubit’ – one that, for practical purposes, doesn’t suffer decoherence. From that point, the only thing left to do is develop the quantum computer architecture, and this will prove troublesome too – but the reward is definitely worth it.

“We are very excited about how the quantum computing field has progressed over the past ten years,” he told me. “Our team has grown significantly over the past 3 years, and I look forward to seeing that team continue to grow and take quantum computing to the next level.”
<urn:uuid:98062559-ed15-41c5-91f5-1353b9900451>
CC-MAIN-2021-04
https://www.zmescience.com/research/ibm-quantum-computer-28022012/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703522242.73/warc/CC-MAIN-20210121035242-20210121065242-00050.warc.gz
en
0.956899
1,117
3.84375
4
Quantum Machine Learning: An Overview Quantum Machine Learning (Quantum ML) is the interdisciplinary area combining Quantum Physics and Machine Learning (ML). It is a symbiotic association: leveraging the power of Quantum Computing to produce quantum versions of ML algorithms, and applying classical ML algorithms to analyze quantum systems. Read this article for an introduction to Quantum ML. At a conference in 2017, Microsoft CEO Satya Nadella used the analogy of a corn maze to explain the difference in approach between a classical computer and a quantum computer. In trying to find a path through the maze, a classical computer would start down a path, hit an obstruction, backtrack; start again, hit another obstruction, backtrack again until it ran out of options. Although an answer can be found, this approach could be very time-consuming. In contrast, quantum computers "unlock amazing parallelism. They take every path in the corn maze simultaneously." This leads to an exponential reduction in the number of steps required to solve a problem. The parallelism comes from the concepts of 'qubit', 'superposition' and 'entanglement' derived from Quantum Physics. I. Quantum Computing: A quantum is the smallest possible unit of any physical entity, such as energy or mass. In 1900, Max Planck proposed that, at the atomic and subatomic level, the energy of a body is contained in discrete packets called 'quanta'. Wave-particle duality is the characteristic of quantum particles to behave sometimes as a wave and sometimes as a particle, depending on the environment. Quantum theory is characterized by finding the probability that a particle is at a given point x in space, rather than its exact location. Fig 1: The dual nature of light, which acts like both particles and waves. A classical computer performs operations using classical 'bits', which are either 0 OR 1. However, a quantum computer uses quantum bits, also called 'qubits', to perform operations. Qubits can be represented by:
- An electron orbiting a nucleus, where |1> and |0> are the excited state and ground state respectively
- A photon, where |1> and |0> are two polarizations of the photon.
Qubits exist as both 0 AND 1 at the same time. This phenomenon is called 'superposition'. Although a particle can exist in multiple quantum states, once we measure that particle for its energy or position, its superposition is lost and it then exists in only one state. Fig 2: The qubit is defined as a pair of complex vectors pointing to a spot on a unit sphere. Traditionally, a qubit pointing directly up (positive on the axis) is denoted as the column vector |0⟩ and the vector pointing down is known as |1⟩. (In this case, the qubit shown is |0⟩.) 'Quantum entanglement' is the phenomenon in which quantum particles interact with each other and are described with reference to each other, not independently, even if the particles are separated by a large distance. At the time of measurement, if one entangled particle in a pair is found to be in the spin state of 'down' (that is, the lowest energy state, with the electron aligned with its magnetic field), then this outcome is immediately reflected in the other correlated particle, which now assumes the opposite spin state of 'up'. Quantum entanglement links qubits, including those far away from each other, so that their measurement outcomes are correlated instantaneously. How does Quantum computing unlock immense parallelism? Two interacting classical bits can take one of 4 forms: 00 or 01 or 10 or 11.
These two components of information, the first bit and the second bit, combine to represent only one binary configuration at a given time. Adding more bits to a regular computer still represents just a single binary configuration at any moment. Fig 3: One qubit in superposition before measurement, with its probabilities of 'spin-up' AND 'spin-down'. One qubit can exist in both states (0 AND 1) at once. Thus, two interacting qubits can store all 4 binary configurations simultaneously. In general, 'n' qubits can simultaneously represent 2^n classical binary configurations. Thus, a 300-qubit quantum computer can explore 2^300 possible solutions at the same time, unlike a classical computer, which explores one solution at a time; this is the source of the immense parallelism. Adding more qubits to a quantum computer exponentially increases the power of the computer. A full-scale quantum computer has not yet been realized because adding more qubits, and dealing with subatomic particles that require a temperature as low as -452°F to remain stable, is daunting, and building a computer around them (a 'quantum computer') even more so. Thus, efforts are underway to 'simulate' 40-qubit operations using Microsoft's quantum simulator, LIQUi|>, extended by Microsoft Azure's cloud computing resources. Quantum Computing can solve specialized scientific problems such as molecular modelling, the creation of high-temperature superconductors, drug modelling and testing, and the selection of molecules for organic batteries. It is not optimal for general-purpose tasks such as watching videos or writing a Word document. Now, how does Quantum Computing fit in with Machine Learning? II. Quantum ML: 2a) Quantum versions of ML algorithms
- Finding eigenvalues and eigenvectors of large matrices: One of the methods to perform the classical PCA algorithm is to take the eigenvalue decomposition of a data covariance matrix (the classical baseline is sketched after this list). However, this is not very efficient for high-dimensional data. Quantum PCA of an unknown low-rank density matrix can reveal the quantum eigenvectors associated with the large eigenvalues, exponentially faster than a linearly-scaled classical algorithm.
- Finding nearest neighbours on a quantum computer: The quantum algorithms for computing nearest neighbours, as used in supervised and unsupervised learning, place an upper bound on the number of queries to the input data required to compute distance metrics such as the Euclidean distance and inner product. The best cases show exponential and super-exponential reductions in query complexity, and the worst case still shows a polynomial reduction in query complexity over the classical analogue.
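Since the text cites classical PCA via eigendecomposition of a data covariance matrix as the routine that quantum PCA aims to accelerate, here is a minimal NumPy sketch of that classical baseline. The data, dimensions, and seed are arbitrary placeholders, not anything from the article.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))             # 200 samples, 5 features (toy data)
X -= X.mean(axis=0)                       # centre each feature

cov = (X.T @ X) / (X.shape[0] - 1)        # 5x5 data covariance matrix

# Eigendecomposition: eigenvectors are the principal components,
# eigenvalues are the variance captured along each component.
eigvals, eigvecs = np.linalg.eigh(cov)    # eigh: the covariance matrix is symmetric
order = np.argsort(eigvals)[::-1]         # sort by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

top2 = eigvecs[:, :2]                     # keep the two largest components
X_reduced = X @ top2                      # project the data onto them

print("explained variance ratio:", eigvals[:2] / eigvals.sum())
# A dense eigendecomposition like this scales poorly with dimension, which is
# exactly the cost that quantum PCA proposals claim to reduce.
```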
<urn:uuid:2508479b-e727-4cd3-a6c4-16fc2c9b81cb>
CC-MAIN-2021-04
https://www.kdnuggets.com/2018/01/quantum-machine-learning-overview.html
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703519395.23/warc/CC-MAIN-20210119135001-20210119165001-00652.warc.gz
en
0.910248
1,358
3.5625
4
Quantum computers have long been touted as incredibly powerful machines that will be able to solve hugely complex computational problems much faster than any computer we have available today. But no-one can agree on the best way to make them. Who will win the race? Superfast quantum computers could speed up the discovery of new medicines, crack the most complex cryptographic security systems, design new materials, model climate change, and supercharge artificial intelligence, computer scientists say. But there's currently no consensus on the best way to make them or how to make them available to the mass market. Physicists, engineers and computer scientists around the world are trying to develop four very different types of quantum computers, based around light particles, trapped ions, superconducting qubits, or nitrogen-vacancy centres in diamonds. Companies like IBM, Google, Rigetti, Intel and Microsoft are currently leading the quantum charge. Each method has its pros and cons, but the overarching challenge is the fragile nature of quantum states themselves. What is quantum computing? Instead of using long sequences of ones and noughts called bits, representing on or off, as in classical computing, a quantum bit – or qubit – uses the near-magical properties of sub-atomic particles. Electrons or photons, for example, can be in two states at the same time – a phenomenon called superposition. As a result, a qubit-based computer can do far more calculations much faster than a conventional computer. "If you had a two-qubit computer and you add two qubits, it becomes a four-qubit computer. But you're not doubling the computer power, you're increasing it exponentially," explains Martin Giles, San Francisco bureau chief of the MIT Technology Review. Computer scientists sometimes describe this quantum computing effect as like being able to go down each path of a very complex maze at the same time. Qubits can also influence each other even when they're not physically connected, a process called "entanglement". In computing terms, this gives them the ability to make logical leaps conventional computers never could. The search for stability But qubits are highly unstable and prone to interference or "noise" from other sources of energy, leading to errors in calculations. So the race is on to find a way to stabilise them for mass production. Computing giant IBM firmly believes that "transmon superconducting qubits" hold the most promise for quantum computing, and it has three prototype quantum processors that the public can access in the cloud. "So far, over 94,000 people have accessed IBM quantum computers in the cloud. They've run over five million experiments and written 110 papers," says Dr Robert Sutor, vice president for quantum computing strategy and ecosystem at IBM Research. "People are learning and experimenting… we hope in three to five years to be able to point at one specific example, and say that quantum significantly improves on anything classical computers can do." But IBM's method requires the quantum computer to be housed in a large fridge, where the qubits are kept at temperatures close to absolute zero to ensure that they remain in their useful states. This takes a lot of energy and means it would be extremely hard to miniaturise. "It seems likely that superconducting qubits will be among the first technologies to enable useful quantum computation," says Joseph Fitzsimons, a principal investigator at the National University of Singapore's Centre of Quantum Technologies.
"However, my impression is that they are analogous to vacuum tubes in early computers, rather than the transistors which came along later. "We may yet see another technology emerge which becomes the ultimate winner." Microsoft and academics at the Niels Bohr Institute in Copenhagen are working on what they believe will be much more stable qubits based on so-called Majorana particles. Other teams are working on trapping qubits in silicon – the material traditional computer chips have been made from. And computer scientists at Oxford University are looking at ways to link smaller qubit computers rather than creating bigger computers with lots of qubits. There are many ways to skin Schrödinger's cat, it seems. While we wait for quantum computers, what's the future for conventional, or classical, computing? In July, Ewin Tang, an 18-year-old graduate in computer science and mathematics from the University of Texas at Austin, made waves in the international computing world by developing a classical computer algorithm that can solve a problem almost as fast as a quantum computer. The problem involved developing a recommendation engine that suggests products to users based on data about their preferences. And the EU recently announced it is working on the next generation of computers – exascale – which would enable a billion billion calculations per second. "Exascale means 10 to the power of 18 operations per second," explains Prof Scott Aaronson, a theoretical computer scientist at UT Austin who mentored Mr Tang. "10 to the power of 18 is big, but quantum systems, which will be capable of 10 to the power of 1,000 operations per second, are much, much bigger." And the problem for classical computing is that we are reaching the limits of how many transistors we can fit onto a chip – Apple's A11 squeezes in an astonishing 4.3 billion, for example. Moore's Law – that every two years, microprocessors will get twice as fast, use half as much energy, and take up half as much space – is finally breaking down. Even if a stable, mass-produced quantum computer always remains elusive, the research is already yielding interesting results. "If we hadn't invested in quantum computing, the quantum algorithm that inspired Mr Tang wouldn't have existed," says Prof Robert Young, a Royal Society research fellow and director of the University of Lancaster's Quantum Technology Centre. Already, he says, quantum research has yielded a new way to cool devices to low temperatures; light-based chip enhancements that have improved the fibre optic broadband experience; and the invention of lab-on-a-chip technologies to speed up the diagnosis of illnesses. "The real benefit of going to the Moon wasn't going to the Moon, it was the peripheral technologies that were developed on the way," says Prof Young – GPS satellite navigation and ball point pens, to name but a few.
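To give a rough sense of scale for the exascale figure quoted above, the following few lines compare 10^18 operations per second against the 2^n configurations an n-qubit register spans. This is back-of-the-envelope arithmetic only, not a statement about how either kind of machine actually computes; the qubit counts are illustrative.

```python
EXASCALE_OPS_PER_SEC = 10**18
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for n in (53, 100, 300):
    states = 2**n
    # Time for an exascale machine merely to enumerate every configuration once.
    years = states / EXASCALE_OPS_PER_SEC / SECONDS_PER_YEAR
    print(f"{n:3d} qubits -> 2^{n} ≈ {states:.3e} states, "
          f"~{years:.3e} years to enumerate at exascale")
```

Even at a billion billion operations per second, simply stepping through the configurations of a 100-qubit register would take far longer than the age of the universe, which is the point Aaronson's comparison is making.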
<urn:uuid:7a9ab265-dd82-43e0-9d3e-45d62b1ff4a8>
CC-MAIN-2021-04
https://www.earthinfonow.com/the-race-to-make-the-worlds-most-powerful-computer-ever/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610704821381.83/warc/CC-MAIN-20210127090152-20210127120152-00452.warc.gz
en
0.934692
1,371
3.78125
4
An Entirely New Type of Quantum Computing Has Been Invented Australian researchers have designed a new type of qubit – the building block of quantum computers – that they say will finally make it possible to manufacture a true, large-scale quantum computer. Broadly speaking, there are currently a number of ways to make a quantum computer. Some take up less space, but tend to be incredibly complex. Others are simpler, but if you want it to scale up you’re going to need to knock down a few walls. Some tried and true ways to capture a qubit are to use standard atom-taming technology such as ion traps and optical tweezers that can hold onto particles long enough for their quantum states to be analysed. Others use circuits made of superconducting materials to detect quantum superpositions within the insanely slippery electrical currents. The advantage of these kinds of systems is their basis in existing techniques and equipment, making them relatively affordable and easy to put together. The cost is space – the technology might do for a relatively small number of qubits, but when you’re looking at hundreds or thousands of them linked into a computer, the scale quickly becomes unfeasible. Thanks to coding information in both the nucleus and electron of an atom, the new silicon qubit, which is being called a ‘flip-flop qubit’, can be controlled by electric signals, instead of magnetic ones. That means it can maintain quantum entanglement across a larger distance than ever before, making it cheaper and easier to build into a scalable computer. “If they’re too close, or too far apart, the ‘entanglement’ between quantum bits – which is what makes quantum computers so special – doesn’t occur,” says the researcher who came up with the new qubit, Guilherme Tosi, from the University of New South Wales in Australia. The flip-flop qubit will sit in the sweet spot between those two extremes, offering true quantum entanglement across a distance of hundreds of nanometres. In other words, this might be just what we’ve been waiting for to make silicon-based quantum computers scalable. To be clear, so far we only have a blueprint of the device – it hasn’t been built as yet. But according to team leader, Andrea Morello, the development is as important for the field as the seminal 1998 paper in Nature by Bruce Kane, which kicked off the silicon quantum computing movement. “Like Kane’s paper, this is a theory, a proposal – the qubit has yet to be built,” says Morello. “We have some preliminary experimental data that suggests it’s entirely feasible, so we’re working to fully demonstrate this. But I think this is as visionary as Kane’s original paper.” The flip-flop qubit works by coding information on both the electron AND nucleus of a phosphorus atom implanted inside a silicon chip, and connected with a pattern of electrodes. The whole thing is then chilled to near absolute zero and bathed in a magnetic field. The qubit’s value is then determined by combinations of a binary property called spin – if the spin is ‘up’ for an electron while ‘down’ for the nucleus, the qubit represents an overall value of 1. Reversed, and it’s a 0. That leaves the superposition of the spin-states to be used in quantum operations. In flip-flop, researchers are able to control the qubit using an electric field instead of magnetic signals – which gives two advantages. It is easier to integrate with normal electronic circuits and, most importantly, it also means qubits can communicate over larger distances. 
“To operate this qubit, you need to pull the electron a little bit away from the nucleus, using the electrodes at the top. By doing so, you also create an electric dipole,” says Tosi. “This is the crucial point,” adds Morello. “These electric dipoles interact with each other over fairly large distances, a good fraction of a micron, or 1,000 nanometres.” “This means we can now place the single-atom qubits much further apart than previously thought possible. So there is plenty of space to intersperse the key classical components such as interconnects, control electrodes and readout devices, while retaining the precise atom-like nature of the quantum bit.” “It’s easier to fabricate than atomic-scale devices, but still allows us to place a million qubits on a square millimetre.” What this new flip-flop qubit means is a balance that could make future quantum computers small and potentially affordable. “It’s a brilliant design, and like many such conceptual leaps, it’s amazing no-one had thought of it before,” says Morello. The research has been published in Nature Communications.
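A minimal way to keep track of the encoding described above (electron and nuclear spins combining into a single qubit value) is to enumerate the four joint spin states and mark which two span the flip-flop qubit. The labels and layout here are purely illustrative and are not taken from the UNSW design documents.

```python
from itertools import product

# Joint basis states of one electron spin and one nuclear spin.
joint_states = list(product(["up", "down"], repeat=2))  # (electron, nucleus)

def flip_flop_value(electron, nucleus):
    """Logical value of the flip-flop qubit, following the encoding in the
    article: electron up / nucleus down -> 1, electron down / nucleus up -> 0.
    The two aligned states lie outside the logical qubit subspace."""
    if electron == "up" and nucleus == "down":
        return 1
    if electron == "down" and nucleus == "up":
        return 0
    return None  # not part of the logical qubit

for electron, nucleus in joint_states:
    value = flip_flop_value(electron, nucleus)
    tag = "outside qubit subspace" if value is None else f"logical {value}"
    print(f"electron {electron:>4}, nucleus {nucleus:>4} -> {tag}")
```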
<urn:uuid:af053e87-94bd-4b54-b782-6e2c47ea4b59>
CC-MAIN-2021-04
https://grendz.com/pin/5575/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703497681.4/warc/CC-MAIN-20210115224908-20210116014908-00053.warc.gz
en
0.917404
1,064
3.921875
4
Machines enrich and enhance our lives, whether it’s the smartphones that allow us to stay connected or the supercomputers that solve our toughest computational problems. Imagine how much more productive and innovative our world will be when computers become infinitely more powerful. Indeed, the growing field of quantum computing may make our current technological capacities look feeble and primitive in comparison. It could even transform the workings of the human brain and revolutionize how we think in ways we can’t begin to imagine. Today, computers operate at the most basic level by manipulating two states: a zero or a one. In contrast, quantum computers are not limited to two states, but can encode information in multiple states that exist in superposition, also known as quantum bits or qubits. In other words, this technology takes advantage of one of the most fascinating properties of the quantum world: the ability of subatomic particles to exist in more than one state at any given time. Consequently, a quantum computer can perform many calculations at the same time, whereas a traditional Turing machine can only perform a single calculation at once. Such quantum machines will be millions of times more powerful than our most powerful current computers. The revolutionary implications of such a computing capacity are immense and will contribute to the acceleration of human thinking and progress. Quantum computers can allow us to push the limits of artificial intelligence, derive ground-breaking insight from big data, advance cryptography, develop new materials, and even simulate virtual quantum systems like never before. According to Greg Tallant, Lockheed Martin fellow at the USC center, “The technology could be harnessed to speed the debugging of millions of lines of code or help solve the aerospace industry’s toughest computational problems.” Many techno-optimists believe that quantum computers will allow us to expand our understanding and capabilities by giving us access to machines that think in ways we never have in multiple states and dimensions at once. But what if it wasn’t just our machines that were unfathomably more intelligent? What if we were too? Imagine being able to ponder multiple ideas at the same time, solve several mathematical equations at once, or have a conversation with more than one person. Imagine what kind of world we would live in if every person could exist in a multitude of mental states. One wild application of quantum computers may be to design them as a carrier for our minds. While this sounds like science fiction, we are seeing exponential growth in brain scanning and mapping technologies, along with neural engineering, which will all contribute to our ability to model the brain and develop technologies that can digitally replace some or all of its functions. Hundreds of millions of dollars are being invested in brain-computer interfaces and implants. Neuroscientist Kenneth Hayworth writes in Skeptic magazine, “All of today’s neuroscience models are fundamentally computational by nature, supporting the theoretical possibility of mind-uploading”. Couple that with the rapidly growing advancements in quantum computing and the futuristic hope that we will one day upload our minds into machines. After all, the laws of physics don’t limit that possibility. There are plenty of technical, scientific, and even philosophical questions yet to be answered. 
Even with the capability, some wonder whether our sense of self would go along for the ride, or if we would have effectively created a copy of ourselves. And perhaps instead, we may simply link more closely with computers via brain-machine interfaces. In either case, we will undoubtedly find our capabilities enhanced. And perhaps an even more mind-blowing aspect of quantum computing is the idea put forward by Oxford University quantum physicist David Deutsch, who suggests that quantum computers function by distributing parallel work across many different universes. These new machines could be humanity's first baby steps towards harnessing the computational power of a multiverse. Will we get there? Some are skeptical about whether quantum computers will be taking over anytime soon. There are many challenges facing experts in the field, such as designing a simple way to control complex systems of qubits. One major obstacle is that qubits are more susceptible to errors than the transistors in classical computers. Another is creating qubits that can maintain their quantum properties for a long period of time, known as the coherence time. Scott Aaronson, a professor at the University of Texas at Austin, has listed the main challenges to quantum computing theory. But we are seeing progress in the field. Tech giants like Google and Amazon are racing to develop their quantum computing technologies. Last year, teams of Google and NASA scientists showed a D-Wave quantum computer was 100 million times faster than a conventional computer in a test, and successfully simulated a hydrogen molecule with it. This was a development that, according to Google quantum software engineer Ryan Babbush, could allow us to "simulate even larger chemical systems" and "revolutionize the design of solar cells, industrial catalysts, batteries, flexible electronics, medicines, materials and more." Many also argue quantum computing will be a successor to Moore's Law, which states that the number of transistors on a microprocessor doubles every 18 months. Moore's Law has been going steady since the 1970s, but traditional chips are approaching natural limits. What will follow? There are a number of competing technologies in the wings, but quantum computing is certainly a leading candidate in a number of powerful applications. Regardless of whether human consciousness naturally relies on some form of quantum phenomena (many speculate that it does, but we do not know for a fact), there is no doubt such machines will push the boundaries of the human mind beyond its natural capabilities.
<urn:uuid:4abc3e9e-4d3e-4caa-ae19-0bbffe95b661>
CC-MAIN-2021-04
https://singularityhub.com/2016/10/02/this-is-your-brain-on-quantum-computers/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703514796.13/warc/CC-MAIN-20210118123320-20210118153320-00053.warc.gz
en
0.933486
1,154
3.890625
4
To mathematicians and those interested in the science of encryption, the job of a cryptographer is an interesting one. Basically, cryptographers work on implementing encryption. This definition from Career Explorer says it well: “A cryptographer is someone who develops algorithms, ciphers and security systems to encrypt sensitive information and provide privacy for people and corporations.” (Read Encryption Vs. Decryption: What's the Difference?) First, let's look at some of the basic things that cryptographers might be involved in. (Read Cryptography: Understanding Its Not-So-Secret Importance to Your Business.) One such activity is the implementation of hash functions. As we've reported before, hash cryptography involves linking the contents of some data structure to a shorter key that shows whether or not data has been changed or tampered with. The key that is ‘hashed’ from the data set is the encryption. This technique is used quite a bit in the cryptography world, and companies hiring these professionals will often ask about their expertise with hash functions. Elliptic Curve Cryptography Here's another part of what cryptographers may be involved in — a concept called “elliptic curve cryptography” uses the algebraic structure of elliptic curves to create public key cryptography results that are useful for digital signatures and in other parts of the encryption world — (it seems likely that this cryptography intern job ad actually spelled the designation wrong). So what else does the job of a cryptographer look like? Here's what we found out from some professionals in the field. From Book Ciphers to Mathematical and Algorithmic Encryption Part of understanding what a modern cryptographer does involves contrasting today's cryptography with the disciplines that came before it. In the old days, cryptographers used simple ciphers to encode messages, for example, a letter shift that simply made each letter of the alphabet into another, or alternately, into a particular rune or symbol. These encryptions, by modern standards, are laughably easy to decode. A few years ago, something called PGP or Pretty Good Privacy was a gold standard — these types of cryptography are much more elaborate and resistant to cracking than the old ciphers. “Encryption has come a long way from simply moving each letter over a few places in the alphabet,” said Shayne Sherman, CEO of TechLoris. “Creating these complicated and highly secure algorithms is one job of a cryptographer. Another is analyzing encrypted data for law enforcement and military organizations to attempt to break certain encryption algorithms.” Both Builders and Breakers Dr. Yehuda Lindell, CEO and co-founder of Unbound Tech, said it well in responding to our questions about cryptography. “In some areas, the cryptanalysts are the ones with the best understanding of how to build secure schemes as well, and so that they are both builders and breakers,” Lindell said. “Primarily, this is in the area of symmetric cryptography: stream ciphers, block ciphers, hash functions, and the like. However, in the area of asymmetric (public-key) cryptography, schemes are typically based on hard problems from number-theory and algebra. As in the symmetric world, these researchers are also the most qualified to propose new hard problems. However, their skill set is usually completely different from those doing symmetric cryptanalysis. Those working in the asymmetric setting typically have very deep math background. 
Having said that, I would argue that almost all cryptographers are pretty good at math.” Lindell’s co-founder Nigel Smart expounded on this idea: “One can sub-divide cryptographers into those that work on breaking schemes, those that work on creating symmetric key schemes; those that work on public key schemes; those that work on basic protocols like key agreement; those that work on more advanced protocols like MPC; those that work on efficient implementations; those that work on secure implementations which avoid side-channels; those that work on software; and those that work on hardware.” Working on Bitcoin and Other Coins Here's another big application of cryptography in today's fintech industry. “Bitcoin and other decentralized forms of payment depend on the work of cryptographers,” says Anna Tatelman, a consultant for Pelicoin. “Unlike with traditional financial institutions, all Bitcoin transactions are pseudonymous. This means that all personal information such as names, addresses, and social security numbers cannot be accessed even by Bitcoin’s creators. This is thanks to diligent cryptographers who hide all users’ personal data to greatly reduce the potential of both internal and external fraud.” From the above input, and in looking at resources showing what today's cryptographers do, we see that although the job role is pretty clearly defined, there's a diversity of techniques and strategies that cryptographers will use to secure data. Whether it's the Bitcoin in your digital wallet, the big databases that retailers use to keep sensitive customer data, or protected secrets in government networks, cryptographers do the tough job of staying ahead of those who would crack or break the systems to get the sensitive data inside. It's a big job, but one that builds on a long tradition of encoding and decoding, one that’s in some ways intuitive to our human intelligence. Now, we harness the incredible logical power of computers to make encryptions ever stronger — in search of the best protection from hackers and malicious intruders. Cryptography is a game that everyone has to play — and it’s still evolving in the age of quantum computing and AI. (Read Quantum Cryptography Vs. Quantum Hacking: A Cat and Mouse Game.)
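The hash-function work described earlier, linking data to a short digest that reveals tampering, can be demonstrated with Python's standard hashlib module. The message strings are arbitrary examples and are not taken from any of the quoted practitioners.

```python
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 digest of `data` as a hex string."""
    return hashlib.sha256(data).hexdigest()

original = b"transfer 100 to account 12345"
tampered = b"transfer 900 to account 12345"

print(digest(original))
print(digest(tampered))
# Changing a single character produces a completely different digest, so a
# verifier who stored the original digest can detect any later modification.
assert digest(original) != digest(tampered)
```

Note that a hash is a fingerprint rather than an encryption: it cannot be reversed to recover the message, which is exactly why it is useful for integrity checks and digital signatures.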
<urn:uuid:6a8bda8f-f5f5-4778-840d-c55bcfae413d>
CC-MAIN-2021-04
https://www.techopedia.com/job-role-cryptographer/2/34169
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703511903.11/warc/CC-MAIN-20210117081748-20210117111748-00257.warc.gz
en
0.942525
1,194
3.71875
4
AI is another disruptive technology of our time, and just like its predecessors, it will have a profound impact on our existence. AI is a set of algorithms designed by humans and expressed through machines that can incorporate the five human senses (seeing, smelling, tasting, hearing and feeling) and ability to communicate (speaking). The very senses that have been historically used to magnify the differences between humans and machines are now enabling machines to effectively handle ever more qualitative operations and analysis. Recent advancements in machine learning, deep learning and quantum computing has increasingly swelled the desire and ability for people to automate processes. AI and robotic applications enable people to create a new world of accelerated process automation that we could not have achieved without the continued focus on improving intelligent systems and machines. The History Of Human And AI Cooperation Humans and machines have coexisted on a large scale since the Industrial Revolution in the late 1700s. It commenced a period of profound technological achievements that brought on massive social and economic change spanning almost every conceivable industry in the world — from textiles and transportation to printing and consumer goods to healthcare, schools and governments. It was signified by machines designed by humans to replace human effort and automate repetitive, time-consuming tasks — allowing humankind the freedom to be more creative and use our imaginations to explore new ideas. More importantly, it gave us the ability to improve the quality and productivity of our work and lives and evolve as people. It was also a time of massive disruption, uncertainty and fear. And, conceivably, it marked the time when humans and machines began to coexist and develop an interdependent relationship. And while some existing jobs were eliminated and replaced by machines, new types of jobs, job functions and business models were created. This symbiotic relationship has been playing out with every major advancement in technology, further deepening the human-machine interdependent relationship. From the transistor that introduced smaller, less expensive computers and computer processing in the late 1940s to the internet in the 1970s to the Fourth Industrial Revolution currently underway, people continue to rely on machines to solve human problems and automate human tasks. While machine learning or neural networks have been in development since the 1950s by great minds such as Alan Turing and Marvin Minsky, what has changed is advancements in computing performance and data storage that allow us to capture and retain significant amounts of data that can be used to build AI applications. Now deep learning applications and neural networks can mirror and mimic the human brain and, in some cases, outperform our ability to solve problems and take action. So, how can we rely on a machine that we built if it can surpass our ability to solve problems and take action? And if a machine can act like us, then what differentiates person from machine? For the human race to continue its evolutionary path, the cooperative relationship between person and machine must also remain strong. AI Is Not Replacing Us, It’s Improving Us The World Economic Forum (WEF) predicts that 75 million jobs will be lost to this era of smart automation. It also estimates that 133 million new jobs will be created. These new jobs require new job skills that are intended to leverage and improve AI and its applications. 
This change will magnify the persistent need for human and AI cooperation and cement our interdependent relationship. AI- and robot-assisted machines are being deployed worldwide at an unprecedented pace and they’re radically improving outcomes across industries. One illuminating example of AI-assisted applications becoming better with human and AI cooperation is the virtual assistant Alexa and the smart home “connected devices” that followed — aiding humans to more efficiently control and operate lighting, security, room temperature and appliances. There are countless other examples of AI- and robot-assisted functions and applications that are improving outcomes across industries. From health care using AI assistance in the operating room to improve surgical and patient outcomes to computer-assisted instruction in education to self-driving cars in transportation to the Department of Defense (DoD) with DARPA’s AI Next campaign, it’s clear that AI is capable of dramatically enhancing our lives and our value as humans. In order for companies to realize benefits from human and AI collaboration, all stakeholders throughout the development supply chain must be involved — from the academics who are advancing AI theory, to data scientists who are applying these theories in industry to design models that solve business problems, to system integrators who are deploying those models into production environments. Through building systems based on open architecture and making a customer’s data accessible to actionable, we in the industry can go a long way in building proof that increasing human and machine cooperation benefits companies in many areas, from improving business processes to fostering new skills in employees. The key is to focus on both human and human and machine collaboration throughout the solution design practice to ensure that organizations are able to maximize the value of AI. Process automation is essentially helping us improve our lives and further define what it means to be human. It frees us from mundane, repetitive tasks and empowers us to challenge our human capabilities and focus on what we do best: imagine, improve, innovate and evolve. Take your company to the next level. Keep your communication with your customers and employees strong, personal and most importantly instant. You can easily do this using the hi.guru conversational AI platform and ultimately enhance every communication experience by using responsive chatbots and other tools. It all starts by consolidating your existing communication channels into one and ensuring a better response time. Read more about our instant messaging platform and about creating a chatbot that’s just right for you.
<urn:uuid:d7904ddd-5405-4315-9e68-4d3f572a13d0>
CC-MAIN-2021-04
https://hi.guru/ai-technology-advancement-cooperation-is-the-key-initiative/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703524858.74/warc/CC-MAIN-20210121132407-20210121162407-00658.warc.gz
en
0.947092
1,142
3.859375
4
Scientists pinpoint the singularity for quantum computers Researchers from the University of Bristol have discovered that super-powerful quantum computers, which scientists and engineers across the world are racing to build, need to be even more powerful than previously thought before they can beat today's ordinary PCs. Quantum computers are a new type of machine that operate on quantum mechanical hardware and are predicted to give enormous speed advantages in solving certain problems. Research groups at leading universities and companies, including Google, Microsoft and IBM, are part of a worldwide race to realise the first quantum computer that crosses into the 'quantum computational singularity'. This represents a problem so complex that today's top supercomputer would take centuries to find a solution, while a quantum computer could crack it in minutes. Now a team of scientists from Bristol have discovered that the boundary to this singularity is further away than previously thought. The research is reported this week in Nature Physics. The results apply to a highly influential quantum algorithm known as 'boson sampling', which was devised as a very direct route to demonstrate quantum computing's supremacy over classical machines. The boson sampling problem is designed to be solved by photons (particles of light) controlled in optical chips – technology pioneered by Bristol's Quantum Engineering and Technology Labs (QETLabs). Predicting the pattern of many photons emerging from a large optical chip is related to an extremely hard random matrix calculation. With the rapid progress in quantum technologies, it appeared as though a boson sampling experiment that crossed into the quantum computational singularity was within reach. However, the Bristol team were able to redesign an old classical algorithm to simulate boson sampling, with dramatic consequences. Dr Anthony Laing, who heads a group in QETLabs and led this research, said: "It's like tuning up an old propeller aeroplane to go faster than an early jet aircraft. "We're at a moment in history where it is still possible for classical algorithms to outperform the quantum algorithms that we expect to ultimately be supersonic. "But demonstrating such a feat meant assembling a crack team of scientists, mathematicians, and programmers." Classical algorithms expert Dr Raphaël Clifford, from Bristol's Department of Computer Science, redesigned several classical algorithms to attack the boson sampling problem, with the 1950's Metropolised Independence Sampling algorithm giving the best performance. The simulation code was optimised by QETLabs researcher 'EJ', a former LucasArts programmer. Expertise on computational complexity came from Dr Ashley Montanaro, of Bristol's School of Mathematics, while QETLabs students Chris Sparrow and Patrick Birchall worked out the projected performance of the competing quantum photonics technology. At the heart of the project and bringing all these strands together was QETLabs PhD student and first author on the paper, Alex Neville, who tested, implemented, compared, and analysed, all of the algorithms. He said: "The largest boson sampling experiment reported so far is for five photons. "It was believed that 30 or even 20 photons would be enough to demonstrate quantum computational supremacy." Yet he was able to simulate boson sampling for 20 photons on his own laptop, and increased the simulation size to 30 photons by using departmental servers. 
Alex added: "With access to today's most powerful supercomputer, we could simulate boson sampling with 50 photons." The research builds on Bristol's reputation as a centre of activity for quantum science and the development of quantum technologies. Through QETLabs, the university has embarked on an ambitious programme to bring quantum technologies out of the laboratory and engineer them in to useful devices that have real-world applications for tackling some of society's toughest problems. In addition to collaborations with tech companies such as Microsoft, Google, and Nokia, start-ups and new business activities focused on quantum technologies have emerged in Bristol. An important theme across the overall quantum research activity is developing our understanding of exactly how quantum technologies can provably outperform conventional computers. Recently Dr Montanaro, together with Professor Noah Linden of the School of Mathematics, organised a Heilbronn Focused Research Group on the topic of quantum computational supremacy. This meeting brought some of the world leaders in the field, from both industry and academia, to Bristol for a week of intense discussions and collaboration. Among the attendees was one of the theorists who devised boson sampling, Professor Scott Aaronson, from UT Austin. Although outperforming classical computers might take a little longer than originally hoped, Dr Laing is still optimistic about the prospects for building a device to do just that. He said: "We now have a solid idea of the technological challenge we must meet to demonstrate that quantum machines can out-compute their classical counterparts. For boson sampling, the singularity lies just beyond 50 photons. It's a tougher nut to crack than we first thought, but we still fancy our chances." With Dr Laing's group focused on practical applications of quantum technologies, the current work puts bounds on the size and sophistication of photonic devices that will be required to tackle industrially relevant problems that are beyond the capabilities of today's classical algorithms.
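The "extremely hard random matrix calculation" behind boson sampling is the matrix permanent. Below is a deliberately naive permanent routine, factorial-time and only usable for tiny matrices, to show why the problem blows up so quickly; it is an illustration of the underlying mathematics, not the optimised simulation code the Bristol team used, and the random matrices stand in for the circuit-derived ones.

```python
from itertools import permutations
import numpy as np

def permanent(M):
    """Permanent of a square matrix by brute force over all permutations.
    Like the determinant but with no alternating signs; cost grows as n! * n."""
    n = M.shape[0]
    return sum(
        np.prod([M[i, sigma[i]] for i in range(n)])
        for sigma in permutations(range(n))
    )

rng = np.random.default_rng(1)
for n in (2, 4, 6, 8):
    M = rng.normal(size=(n, n))
    print(f"n={n}: permanent ≈ {permanent(M):.4f}")
# Boson sampling probabilities involve permanents of matrices built from the
# optical circuit, which is why each extra photon makes classical simulation
# so much harder; clever algorithms push the crossover out to ~50 photons.
```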
<urn:uuid:f712016e-9771-4a43-99ce-43d7eb15f54f>
CC-MAIN-2021-04
https://phys.org/news/2017-10-scientists-singularity-quantum.html
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703527850.55/warc/CC-MAIN-20210121194330-20210121224330-00060.warc.gz
en
0.937086
1,067
3.75
4
A recent breakthrough in the research revealed the process of making the wonder material graphene from trash. Graphene is cool stuff. The single-atom-thick layer of carbon has a number of properties that make it almost endlessly useful. Because of all the neat tricks it can do, it's popularly dubbed a "wonder material". But over a decade and a half after it was first isolated, the only thing I'm wondering is: where is it? Turns out the stuff is really hard to make in useful quantities, but a recent breakthrough from researchers at Rice University promises to make large amounts of graphene in a flash from your trash. Graphene itself is not much to look at—it kind of resembles chicken wire. But this honeycomb lattice of carbon can do some amazing things. It is one of the thinnest, strongest, and most conductive materials we have ever discovered. This strength can be used to reinforce other materials. Its amazing conductivity could help us make energy-dense batteries or efficient heat sinks. Its flexibility could make wearable electronics and bendable displays. Ironic, given that it was first extracted by pressing a piece of ordinary household sticky tape onto a block of graphite and peeling it off, then re-sticking and peeling the tape until only thin flakes are left behind. It's like it's taunting us. But there's a reason we don't have armies of people just peeling tape apart. The graphene produced by this technique is still a few layers thick, and we are after that single-atom-thick goodness. As of right now, the prevailing methods to achieve that usually involve assembling it on sheets of copper, then using plastics and chemicals to get it off. But the process is not environmentally friendly, and it's slow and expensive. A 60 mm x 40 mm piece of monolayer graphene on copper will cost you about $172. But what if we're overthinking this? What if we could just take any old carbon source and zap it to make graphene? As far as I can tell, that's basically the line of thinking the researchers from Rice University followed. The method they have developed involves charging high-voltage capacitors with electricity, then releasing them all at once into almost any substance containing carbon. The current passes through the target material, heating it to over 3,000 kelvin and breaking every carbon-to-carbon bond in the process. The non-carbon elements sublime out, while the carbon atoms rearrange themselves as graphene. Excess energy is dispersed as light, so the researchers dubbed the product "flash graphene." The change can take as little as ten milliseconds. Not only does this produce a gram of graphene quickly and cheaply; it also makes a particular kind of graphene called turbostratic graphene. Unlike A-B stacked graphene, which has orderly layers that are hard to pry apart, the layers of turbostratic graphene have no ordered alignment. This means they can be easily separated using solvents or inside composite materials. Now, this process doesn't make large sheets of graphene, just small flakes. So it may not be the breakthrough that leads to flexible screens you can put on a T-shirt. But it still has some very useful—albeit less flashy—applications. The researchers envision flash graphene being added to concrete, and estimate that just a fraction of a percent of graphene added in could boost cement's strength by 35%. That translates to less building material needed, saving costs and lessening the environmental impact.
Flash graphene could be an ecological double win because it can be made with recycled plastic or food waste, or it could be an alternative use for cheap coal that doesn't involve burning it and releasing CO2. The Department of Energy thinks turning coal into graphene looks promising and is funding the research with the goal of producing a kilogram of flash graphene a day within two years. We are all clamoring for graphene to take the world by storm, but the reality is that it'll take incremental steps like this to bring this wonder material into our daily lives. It's already showing up in places that are hard to spot, like inside headphones and the coating of motorcycle helmets. Now thanks to this new work, it may soon show up in our buildings, too. And the only way you might be able to tell is if you measured the thickness of the walls or noticed there was suddenly a lot less plastic and banana peels lying around. One of the lead researchers from Rice, James Tour, started experimenting with making graphene out of odd sources because of a bet in 2011 when a colleague challenged him to make it out of, among other things, cockroaches and dog poop.
<urn:uuid:7d52caf7-a0ee-466d-bf50-f3cc50b4d545>
CC-MAIN-2021-04
https://www.thenotitia.com/the-wonder-material-graphene-from-trash/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703517966.39/warc/CC-MAIN-20210119042046-20210119072046-00059.warc.gz
en
0.947611
1,037
3.984375
4
Chinese physicists say they have built a quantum computer one trillion times faster than the most powerful supercomputer, with potential for some real-life applications. The researchers said that using a process called "Gaussian boson sampling", their Jiuzhang prototype quantum computer took a little over three minutes to complete a task that the world's fastest conventional machine would not be able to complete in 600 million years. "This achievement firmly established our country's leading position in international quantum computing research," a team of researchers led by Professor Pan Jianwei said in a statement introducing a paper published on the website of the journal Science on Friday. Quantum computers rely on some counter-intuitive physics of the subatomic world, and are extremely fragile and difficult to maintain. Conventional computers, meanwhile, struggle to cope with problems that involve uncertainty, such as predicting the rise and fall of the stock market, simulating the vibration of some elusive atoms, tracing the origin of a new-found virus, or guessing a bank account password. The Jiuzhang was built to find clues in this kind of chaos. For instance, a database may contain many smaller data sets, some of which could have an unknown relation to the others. The Jiuzhang could quickly find out which data sets were related, a daunting task for traditional computers if the database contained a large amount of random information. This unique calculation capability has a wide range of potential applications such as data mining, bioinformatics and finance, according to the researchers. In the test reported in Friday's paper, the Jiuzhang used light particles called photons to perform calculations. The photons must be generated in their purest possible form, because even a small physical discrepancy could lead to errors. And they must be produced one after another, a technical challenge that pushes optical precision to the limit. "It is easy for us to have one sip of water each time, but it is difficult to drink just a single water molecule each time," said Pan, a lead scientist in China's national quantum research programme, based at the University of Science and Technology of China in Hefei, Anhui province. Though small in size, the Jiuzhang could be one of the most complex optical instruments ever built, with 25 crystals, each tailor-made and maintained at a precise temperature, to manipulate the photons and simulate real-life chaos. To obtain accurate results, Pan's team also developed the world's most sensitive light detectors. But how could the results be verified? If the machine made a mistake, Pan's team reasoned, it could be detected by indirect measures such as abnormal spikes of temperature in some critical components, which did not happen. They also tested the results of smaller-scale calculations on Sunway TaihuLight, the fastest supercomputer in China. One of those tests consumed US$400,000 worth of computer time, according to Scott Aaronson, a peer reviewer of Pan's paper. "This was by far the most expensive referee report I ever wrote," he said. Aaronson, a computer science professor at the University of Texas at Austin, came up with the original idea of a light-based quantum computer. He told the South China Morning Post that he did not expect the pace of development to be so fast. When he proposed the idea, some physicists said it would never work. Even Aaronson once thought the design would remain on paper forever.
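As a quick sanity check on the figures quoted above (a run of a little over three minutes versus 600 million years, described as a trillion-fold advantage), a couple of lines of arithmetic suffice. The numbers come straight from the article's own claims, not from the underlying paper.

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

jiuzhang_seconds = 3 * 60                            # "a little over three minutes"
classical_seconds = 600_000_000 * SECONDS_PER_YEAR   # "600 million years"

speedup = classical_seconds / jiuzhang_seconds
print(f"implied speed-up ≈ {speedup:.2e}")           # on the order of 10^14
# 10^14 is a hundred trillion, so the headline "one trillion times faster"
# is, if anything, a conservative rounding of the article's own numbers.
```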
The Jiuzhang is not the first quantum computer to appear to outperform a traditional computer. Google announced last year that Sycamore, a similar machine, could do a task in 200 seconds that would take 10,000 years on a supercomputer. But researchers from IBM quickly showed that the same task could be done on a traditional computer in two days with a new algorithm. And Sycamore made a lot of mistakes due to the instability of its operation. The Jiuzhang, named after a 2,000-year-old Chinese maths text, is China's answer to the sceptics on quantum computer technology. It does not need to be sealed at extremely low temperatures to work, unlike some other quantum computers, and it can operate in a stable manner for longer. And, in the boson test, it was 10 billion times faster than Sycamore, securing a huge advantage in performance that is highly unlikely to be challenged by a traditional computer, according to some physicists not involved in the study. China and the US have engaged in a heated race in quantum technology. China launched the world's first quantum satellite and built the longest quantum communication network, but seemed to be losing to the Americans on the computer front. Critics at home also say quantum technology consumes too much taxpayers' money and produces too little of practical value. The Jiuzhang cannot be used immediately in real-life applications. It will need to work with a programmable chip to perform various calculations. And it cannot solve the factoring problem that is crucial to decoding encrypted information, so bank accounts are still safe, according to a quantum researcher not involved in the study. "You cannot use this as an excuse to spend all your savings," he said.
<urn:uuid:bef422aa-d9b6-4091-886f-9d33b6dfa65f>
CC-MAIN-2021-04
https://hksar.org/china-claims-quantum-computing-lead-with-jiuzhang-photon-test
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703565376.63/warc/CC-MAIN-20210125061144-20210125091144-00261.warc.gz
en
0.948902
1,096
3.71875
4
- Quantum Supremacy: definition
- Encryptions remain uncrackable so far
- Peter Shor's Algorithm
- Final words
Google researchers completed an experiment that demonstrates the first computation that can be performed only with a quantum processor. It is something scientists call "quantum supremacy." A quantum computer can perform tasks that no conventional processor can within a reasonable period. If quantum computers could be scaled up far enough, they would eventually change the world of cryptography forever. Deciphering a code would only take seconds or minutes instead of long years. As a result, the current cyber encryption methods would require an update. Otherwise, they won't be able to withstand the immense computing power of a quantum processor. The problem is that scientists and media people think differently. And no, Google's quantum supremacy experiment will not put an end to data encryption as we know it, at least not any time soon. What Is Quantum Supremacy? Quantum computing offers much higher speeds compared to regular computers when facing complex calculations. Traditional processors require months, even years, to solve complicated equations and problems. However, quantum computers can find solutions exponentially faster by using qubits. According to the paper that Google published, their quantum processor "takes about 200 seconds to sample one instance of the quantum circuit one million times." An existing supercomputer that uses classical computer architecture would need 10,000 years to complete a task of such complexity, the paper adds. However, IBM was quick to rebuff Google's claim that it had reached quantum supremacy. On October 21, the company argued that its supercomputer "Summit" could complete the task in two and a half days. Therefore, the race is still on. "We argue that an ideal simulation of the same task can be performed on a classical system in 2.5 days and with far greater fidelity," wrote IBM's Edwin Pednault, John Gunnels, and Jay Gambetta. What makes a quantum processor work so much faster than a classical one? Traditional computers use zeroes and ones to store data in pieces we call bits. Meanwhile, quantum processors work at the atomic level and use quantum bits instead. We call them qubits, and they can be zero, one, or a superposition of the two. Thus, you get far greater efficiency when processing certain kinds of data. If you think it is too complicated, think again. IBM released a commercial version of its 14th quantum computer in October. It had 53 qubits of computing power, nearly double the capacity of the earlier one it released a year ago. Did you read any news reports that an evil government uses IBM's quantum computer to crack codes? I bet you didn't. Quantum Supremacy Does Not Mean Unsafe Cryptography Codes Quantum computers do not have any practical uses at this stage. Just because they are available for commercial use does not mean you can use them for instant budgeting, for example. Nor can they crack cryptographic codes. What these computers can do is attack certain problems in parallel, resulting in markedly shorter processing times, whereas traditional devices process data sequentially. Until recently, classical machines had been able to reproduce the performance of quantum computers of up to about 40 qubits. Google's Sycamore processor uses 53 qubits. But further developments in quantum computing will require building a large-scale, fault-tolerant quantum computer. Neither Google nor IBM has been able to do that yet.
And I doubt there is one hidden somewhere in a secret R&D laboratory. But if such a computer were available, how could it break a code that would take current classical machines tens of thousands of years to crack?

Why Shor’s Algorithm Matters

RSA, or Rivest-Shamir-Adleman, is an asymmetric encryption algorithm and a standard cryptographic technique for encrypting data on the Internet. It uses two different keys: a public one for encryption and a private one for decryption. RSA users create and publish a public key based on the product of two large prime numbers, and anyone can use this key, since it is open, to encrypt a message. However, the prime numbers must remain secret. Otherwise, anyone could decode the information, because the prime factors serve as the private key that decrypts the message.

Current computers would need many years to break a 2048-bit key. If you use a 1024-bit key, anyone with a sizeable budget can crack it within a year. And that is where Shor’s algorithm enters the equation. American mathematician Peter Shor proposed a quantum algorithm for integer factorization back in 1994. It solves the following problem, which is central to cryptography: find the prime factors of any given integer. A capable quantum computer could, in theory, crack all of our current encrypted communications in no time using Shor’s algorithm. But we are still a long way from witnessing such a computer in action. “Sycamore” demonstrates quantum supremacy only within a narrow sampling task; putting Shor’s algorithm to work requires much more.

Quantum Supremacy – Parting Words

Quantum computing is a viable technology. However, we are not yet sure whether it can do something useful that a conventional processor cannot. Furthermore, quantum computers are unstable, which hinders their use for practical purposes today. They also need to hold onto quantum information long enough to finish a computation, but disturbances from the environment change the state of the qubits, leading to the destruction of the saved information.

Everything with the word “quantum” in its name is much more complicated than it appears when you read about it in popular magazines. What it means for your cybersecurity is that current encryption methods are quite safe, for now.

Do you think scientists are on the verge of reaching quantum supremacy? Or do you believe we still have a long way to go? Tell us what you think in the comment section below.
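The RSA description above can be made concrete with a deliberately tiny example. The Python sketch below is illustrative only, with toy primes that no real system would ever use; it shows how the public modulus is the product of two secret primes and how recovering those primes, which is exactly the factoring problem Shor's algorithm targets, immediately exposes the private key.

```python
# Toy RSA with tiny primes -- purely illustrative, never secure.
from math import gcd

p, q = 61, 53                 # secret primes (real keys use primes ~1024 bits each)
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)       # Euler's totient of n
e = 17                        # public exponent, must be coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)           # private exponent (modular inverse, Python 3.8+)

message = 65
ciphertext = pow(message, e, n)        # anyone holding (n, e) can encrypt
recovered = pow(ciphertext, d, n)      # only the holder of d can decrypt
assert recovered == message

# Breaking the key is "just" factoring n. Trial division works for a toy n,
# but the cost grows exponentially with the key size; Shor's algorithm would
# factor a cryptographic-size n in polynomial time on a capable quantum computer.
factor = next(f for f in range(2, n) if n % f == 0)
print(f"n = {n} = {factor} x {n // factor}  ->  private key exposed")
```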
<urn:uuid:681e3108-3849-4bf9-afa9-c37f7789da04>
CC-MAIN-2021-04
https://anonymania.com/can-quantum-computer-break-encryption/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703521139.30/warc/CC-MAIN-20210120151257-20210120181257-00258.warc.gz
en
0.929039
1,260
3.53125
4
News and videos about quantum computers (QC) are common. ‘Quantum’ inspires awe and mystery. Astonishing speed-ups are promised. ‘Entanglement’ is thrown in the mix - people become hooked. But this computer research that inspires such fascination is an area that offers the fewest opportunities for involvement or understanding. Want to learn to programme? Use tools like Scratch. Want to develop machine learning skills? There’s a Python package for that. Want to learn about QC? Zip through these courses on complex vector spaces, number theory, and an undergraduate introduction to quantum mechanics. Then you can start trying to understand the basics of QC! But what about the only ‘killer app’ for QC - Shor’s Algorithm? Well, that would strain the brain of a third-year maths undergraduate. The mysteries of quantum effects are easy to understand in the maths. In the equations all is clear. But it is a mix of maths topics unusual to find in the average computer programmer. Another approach to understanding QC involves helping other people understand it. One way to do this is to create musical problems and use QC to solve them. Discussing the solution to these problems can provide a greater insight into QC. The example in this article is the musical problem of chords, solved on a quantum D-Wave 2X. The first company to sell quantum computers was D-Wave, who flogged a few off to people such as Google, NASA and Lockheed Martin. The D-Wave computers are adiabatic quantum computers (ADC). They are not like normal algorithmic step-by-step QC, such as those made by IBM. An adiabatic quantum computer is reminiscent of a neural network. It is based on the equations for Ising models. Ising models describe the physics of a magnetic material through the molecules within it. An ADC solves the equations of the Ising model to minimise the energy in the simulated magnetic material. The programming involves defining properties of the simulated ‘molecules’. Over a period of 28 years, more than 10,000 publications came out in areas as wide as zoology and artificial intelligence on the applications of the Ising. There is an ongoing debate about how the D-Wave ADC truly functions and what speedup it can provide. Google claimed large speed increases for its quantum hardware. This is thought to be due to quantum tunnelling. When searching for low energy states, a quantum system can tunnel into nearby states. Quantum tunnelling allows physical systems to move to states in ways that would not be possible in the classical Newtonian view of the world. The systems ‘tunnel’ through to the new, usually inaccessible states instantaneously. This particular musical problem was set up by assigning each note of the musical scale to one ‘molecule’ of the Ising model. Each molecule is modelled by a quantum bit, or Qubit. At this point, the mathematical world of quantum mechanics is entered, where everything makes sense in the equations, but not in the explanation! Every qubit can be simultaneously a one or zero (unlike a bit which can only be one or zero). This is very simple mathematically, but makes no sense in our everyday observed world. For example, a cat cannot be both alive and dead, as Schrodinger once observed in his famous thought experiment. He was trying to imagine the laws of quantum mechanics applying to the world beyond subatomic particles. This, so called, ‘superposition’ of one and zero is not a form of statistical or probabilistic computing. It is something more complex. 
In the combination of one and zero held by this single qubit, the one and the zero also have what is known as a ‘phase’. This can be thought of as the result of another strange consequence of quantum theory: everything is simultaneously a wave and a particle. An electron is a waveform, and a light wave is also a particle of light called a photon. When the qubit is actually measured, its resulting value will always be 0 or 1. For definite. What’s more, the phase of the 0 and 1 in the superposition has no effect on the chance of whether a 0 or 1 is seen. But, until that observation, not only is the result indeterminate, but these phases have dramatic effects on how qubits interact. Things have clearly moved beyond the realms of how programming is normally thought about. The qubit being like a bit that is both 0 and 1 is a useful analogy, but it’s incomplete. Qubits in harmony The D-Wave 2X dealt with many underlying complexities. Connections were set up between the ‘molecules’ (the musical notes) in such a way that when the D-Wave program was triggered, it generated the form of musical chord required. A simple musical rule is used. The D-Wave would be sent a note, and it would find three or four notes which included this note, and which were not too close together nor far apart on the piano keyboard. Try pressing three notes at the same time on the piano keyboard. If they are too close they clash, if they are too far apart they don’t sound like a chord. Each time the D-Wave was asked to harmonise a note using this algorithm, it would send me multiple possible solutions. This highlights a key element of QC - there is no single correct solution to an algorithm. The solutions are held in a superposition, and then when observed, a single solution presents itself. This is not necessarily the precise process the D-Wave is following, but its qubits move through a number of superpositions as a solution form. These ideas were captured and explained in a performance at the Port Eliot Music Festival in July 2017 called ‘Superposition’. It was a composition for mezzo soprano (Juliette Pochin) and electronic sounds. The electronics were generated by a real-time music system on my laptop, connected over the internet to the D-Wave 2X at USC. The mezzo-soprano’s music was pre-composed. The sounds of her voice were picked up live by the laptop, converted into energy and frequency readings, and sent to the D-Wave as a problem to be solved by the harmony generator. The D-Wave returned multiple solutions. The local laptop took the multiple chords, spread them across the musical range, and played them together. These giant chords gave the audience some sense of the multiple solutions that may have existed in the superposition inside the quantum computer. Universal quantum computers The next performance planned will involve the Universal QC (UQC) of IBM. UQC have logic gate diagrams and assembly code. They have processing elements, like NOT, XOR and a form of AND gate. But… the analogy breaks down. There are also gates that change qubit phase. The ‘Hadamard’ gate that takes as input a qubit that is definitely a 1 or 0, and turns it into an indeterminate superposition. Combine a Hadamard gate with a quantum XOR gate and you have ‘entangled’ qubits. Entanglement, vital to QC algorithms and probably the most famous element of QC, is once again simple to see in the maths, but makes little sense if explained otherwise. Quantum computing, both adiabatic and universal, is proving a fruitful research topic. 
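The "Hadamard plus quantum XOR gives entangled qubits" statement above can be checked with nothing more than matrix arithmetic. The NumPy sketch below builds the two gates explicitly and applies them to two qubits starting in |00>; it is a plain statevector calculation, not code for any particular quantum device.

```python
# Hadamard on qubit 0 followed by a CNOT (the quantum XOR) entangles two qubits.
import numpy as np

H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)      # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],            # control = qubit 0, target = qubit 1
                 [0, 1, 0, 0],            # basis order: |00>, |01>, |10>, |11>
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

zero_zero = np.array([1, 0, 0, 0], dtype=float)   # state |00>

after_h = np.kron(H, I) @ zero_zero       # put qubit 0 into superposition
bell = CNOT @ after_h                     # entangle: (|00> + |11>) / sqrt(2)

print(np.round(bell, 3))                  # -> [0.707 0.    0.    0.707]
# Measurement probabilities: 50% |00>, 50% |11>, never |01> or |10>.
# The two qubits' outcomes are perfectly correlated, i.e. entangled.
print(np.round(bell ** 2, 3))
```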
What is lagging is true public engagement. People, and most programmers, don’t know degree-level maths. So, let’s find new approaches to explain, and perhaps one day utilise, the power of quantum computing in more comprehensible ways. Information on Alexis Kirke’s work and further projects can be found at: www.alexiskirke.com
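In that spirit, the chord rule described earlier (pick three or four notes that contain the given note and are neither too close together nor too far apart) can also be turned into a small classical stand-in for the D-Wave program. In the sketch below every weight and threshold is invented for illustration; a real adiabatic run would encode a comparable cost function in qubit couplings and return many low-energy samples at once, rather than enumerating all 2^12 combinations.

```python
# Brute-force stand-in for the D-Wave chord harmoniser described above.
# Assumptions: 12 chromatic notes numbered 0-11, chords of 3-4 notes,
# adjacent chord notes ideally 3-8 semitones apart. All weights invented.
from itertools import product

MELODY_NOTE = 0  # the note we are asked to harmonise

def energy(bits):
    notes = [i for i, b in enumerate(bits) if b]
    e = 0.0
    if MELODY_NOTE not in notes:
        e += 10.0                         # must include the melody note
    e += 4.0 * abs(len(notes) - 3.5)      # prefer 3- or 4-note chords
    for lo, hi in zip(notes, notes[1:]):
        gap = hi - lo
        if gap < 3:
            e += 5.0                      # too close together: clashes
        elif gap > 8:
            e += 5.0                      # too far apart: not chord-like
    return e

# Enumerate all 2**12 note combinations and keep the lowest-energy ones,
# mimicking the multiple low-energy solutions the D-Wave returned.
states = sorted(product([0, 1], repeat=12), key=energy)
for bits in states[:5]:
    chord = [i for i, b in enumerate(bits) if b]
    print(chord, "energy =", round(energy(bits), 2))
```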
<urn:uuid:7444686a-1128-4d3c-8b8d-3230b5fe516e>
CC-MAIN-2021-04
https://www.bcs.org/content-hub/experiencing-quantum-through-music/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610704821381.83/warc/CC-MAIN-20210127090152-20210127120152-00462.warc.gz
en
0.948361
1,654
3.78125
4
As mysterious as the Italian scientist for which it is named, the Majorana particle is one of the most compelling quests in physics. Its fame stems from its strange properties – it is the only particle that is its own antiparticle – and from its potential to be harnessed for future quantum computing. By Catherine Zandonella, Office of the Dean for Research First Posted on June 13, 2019 to the Discovery blog of the Office of the Dean for Research In recent years, a handful of groups including a team at Princeton University have reported finding the Majorana in various materials, but the challenge is how to manipulate it for quantum computation. In a new study published this week, the Princeton team reports a way to control Majorana quasiparticles in a setting that also makes them more robust. The setting – which combines a superconductor and an exotic material called a topological insulator – makes Majoranas especially resilient against destruction by heat or vibrations from the outside environment. What is more, the team demonstrated a way to turn on or off the Majorana using small magnets integrated into the device. The report appeared online in the journal Science. “With this new study we now have a new way to engineer Majorana quasiparticles in materials,” said Ali Yazdani, Class of 1909 Professor of Physics and senior author on the study. “We can verify their existence by imaging them and we can characterize their predicted properties.” “The new platform combines the edge states of a newly discovered type of topology, the higher order topological insulator with magnetism, to give the largest gap one-dimensional topological superconductor with Majorana modes,” said B. Andrei Bernevig, a study co-author and professor of physics at Princeton. “We are entering a new age of material design, where three-dimensional bulk topological materials can be combined with magnetic islands, domain walls and step edges to engineer structures whose properties can fulfill the most exotic requirements of quantum mechanics.” The Majorana is named for physicist Ettore Majorana, who predicted the existence of the particle in 1937 just a year before mysteriously disappearing during a ferry trip off the Italian coast. Building on the same logic with which physicist Paul Dirac predicted in 1928 that the electron must have an antiparticle, later identified as the positron, Majorana theorized the existence of a particle that is its own antiparticle. Typically when matter and antimatter come together, they annihilate each other in a violent release of energy, but the Majoranas, when they appear as pairs each at either end of specially designed wires, can be relatively stable and interact weakly with their environment. The pairs enable the storing of quantum information at two distinct locations, making them relatively robust against disturbance because to change the quantum state requires operations at both ends of the wire at the same time. This capability has captivated technologists who envision a way to make quantum bits – the units of quantum computing – that are more robust than current approaches. Quantum systems are prized for their potential to tackle problems impossible to solve with today’s computers, but they require maintaining a fragile state called superposition, which if disrupted, can result in system failures. A Majorana-based quantum computer would store information in pairs of particles and perform computation by braiding them around each other. 
The results of computation would be determined by annihilation of Majoranas with each other, which can result in either the appearance of an electron (detected by its charge) or nothing, depending on how the pair of Majoranas have been braided. The probabilistic outcome of the Majorana pair annihilation underlies its use for quantum computation. The challenge is how to create and easily control Majoranas. One of the places they can exist is at the ends of a single-atom-thick chain of magnetic atoms on a superconducting bed. In 2014, reporting in Science, Yazdani and collaborators used a scanning tunneling microscope (STM), in which a tip is dragged over atoms to reveal the presence of quasiparticles, to find Majoranas at both ends of a chain of iron atoms resting on the surface of a superconductor. The team went on to detect the Majorana’s quantum “spin,” a property shared by electrons and other subatomic particles. In a report published in Science in 2017, the team stated that the Majorana’s spin property is a unique signal with which to determine that a detected quasiparticle is indeed a Majorana. In this latest study, the team explored another predicted place for finding Majoranas: in the channel that forms at the edge of a topological insulator when it is placed in contact with a superconductor. Superconductors are materials in which electrons can travel without resistance, and topological insulators are materials in which electrons flow only along the edges. The theory predicts that Majorana quasiparticles can form at the edge of a thin sheet of topological insulator that comes in contact with a block of superconducting material. The proximity of the superconductor coaxes electrons to flow without resistance along the topological insulator edge, which is so thin that it can be thought of as a wire. Since Majoranas form at the end of wires, it should be possible to make them appear by cutting the wire. “It was a prediction, and it was just sitting there all these years,” said Yazdani. “We decided to explore how one could actually make this structure because of its potential to make Majoranas that would be more robust to material imperfections and temperature.” The team built the structure by evaporating a thin sheet of bismuth topological insulator atop a block of niobium superconductor. They placed nanometer-sized magnetic memory bits on the structure to provide a magnetic field, which derails the flow of electrons, producing the same effect as cutting the wire. They used STM to visualize the structure. When using their microscope to hunt for the Majorana, however, the researchers were at first perplexed by what they saw. Some of the time they saw the Majorana appear, and other times they could not find it. After further exploration they realized that the Majorana only appears when the small magnets are magnetized in the direction parallel to the direction of electron flow along the channel. “When we began to characterize the small magnets, we realized they are the control parameter,” said Yazdani. “The way the magnetization of the bit is oriented determines whether the Majorana appears or not. It is an on-off switch.” The team reported that the Majorana quasiparticle that forms in this system is quite robust because it occurs at energies that are distinct from the other quasiparticles that can exist in the system. The robustness also stems from its formation in a topological-edge mode, which is inherently resistant to disruption. 
Topological materials derive their name from the branch of mathematics that describes how objects can be deformed by stretching or bending. Electrons flowing in a topological material thus will continue moving around any dents or imperfections. The study, “Observation of a Majorana zero mode in a topologically protected edge channel,” by Berthold Jäck, Yonglong Xie, Jian Li, Sangjun Jeon, B. Andrei Bernevig and Ali Yazdani, was published online in the journal Science on June 13, 2019. DOI http://dx.doi.org/10.1126/science.aax1444 Funding was provided by the Gordon and Betty Moore Foundation as part of EPiQS initiative (GBMF4530), the U.S. Office of Naval Research (ONR-N00014-17-1-2784, ONR-N00014-14-1-0330), the National Science Foundation’s NSF-MRSEC programs through the Princeton Center for Complex Materials (DMR-142054, DMR-1608848), the Alexander-von-Humboldt Foundation through a Feodor-Lynen postdoctoral fellowship (BJ). Support was also provided by the U.S. Department of Energy (DE-SC016239, NSF EAGER 1004957, Simons Investigator Grants, U.S. Army Research Office MURI W911NF- 12-1-0461, the David and Lucile Packard Foundation, and Princeton’s Eric and Wendy Schmidt Transformative Technology Fund. The theory effort was also supported by the National Natural Science Foundation of China under Project 11774317 (JL).
<urn:uuid:82770863-5773-40e0-be22-1425046bd22f>
CC-MAIN-2021-04
https://cefr.princeton.edu/news/mysterious-majorana-quasiparticle-now-closer-being-controlled-quantum-computing
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610704821253.82/warc/CC-MAIN-20210127055122-20210127085122-00263.warc.gz
en
0.937663
1,805
3.65625
4
The main difference between a quantum computer and a classical one is the qubit. Qubits are like classical bits, in that they hold binary values of either 1 or 0, on or off, true or false, etc. However, qubits, being quantum objects, can be in a superposition of both states at once. The physical manifestation is often something like a particle in either a spin up or spin down state. (This is true for digital quantum computing, where a discrete state is necessary. There is also analog quantum computing, which presumably works with other properties that are more continuous.)

We might write the superposition of a qubit as a|0> + b|1>, meaning it can be in a superposition of both 1 and 0 at the same time. So far so boring. But if we add a second qubit and have the two interact, we now have two entangled quantum objects which, together, can be in a superposition of four different states, which we might write as a|00> + b|01> + c|10> + d|11>. In other words, adding a second qubit doubled the number of parallel states they can collectively be in. If we add a third qubit into the mix, which also, through interaction, joins the entanglement, we get eight states in the superposition: |000>, |001>, |010>, |011>, |100>, |101>, |110>, and |111>.

It’s important to understand that these are superpositions, not alternatives. The three qubits, until a measurement is done, can be in all these states at the same time. If we increase the number to ten qubits, then the overall system can be in 2^10, or 1,024, states at the same time. (Which I won’t attempt to lay out.) The Google quantum computer that demonstrated quantum supremacy (over classical computers) was reported to have 53 qubits, which in principle meant it should have been capable of being in 2^53, or about 9 x 10^15, states concurrently.

This is the power of quantum computing. It allows a level of parallel processing not possible with classical systems. A 300 qubit system would be able to be in a superposition of more states than there are particles in the observable universe.

Consider this. Where are all those states? According to quantum mechanics, they’re all right there, in those 300 particles. Well, at least under interpretations that consider the wave function to model something real. The question is, under the interpretations that don’t, how do they account for these kinds of systems? One thing I’ve read indicates that maybe the systems aren’t really running in parallel. Maybe they’re just executing a far more clever algorithm, and the wave function mathematics are just a convenient mechanism to keep track of it. This move seems, to me, increasingly dubious as the number of qubits increases.

The interesting question is, what happens when the overall system is measured? In all interpretations, that act only provides access to one of the states, with no control over which one. A successful quantum circuit has to promote the desired answer so that all its states have it as the end result. But it’s interesting to think about what happens under each interpretation.

Before doing so, it’s worth noting the raw physics of the situation. When a measurement begins, the quantum particles / waves / objects in the measuring device interact with the quantum objects, the qubits, in the quantum circuit. There’s no real distinction between the atoms in the quantum circuitry and the ones in the measuring system. In most interpretations, what changes is the sheer number of interactions involved.
Under the Copenhagen interpretation, the involvement of macroscopic classical mechanisms cause the massive superposition of states to collapse to one classical state, although Copenhagen seems agnostic on the exact mechanisms. Various physical collapse interpretations see the wave physically reducing to a single state. Under the pilot-wave interpretation, there were always waves and particles, with the waves guiding the particles, and interaction with the environment causes the wave to lose coherence so that the actual particle states are now accessible. (At least I think that’s the way it would work under pilot-wave.) The sequence under relational quantum mechanics (RQM) seems particularly interesting. If I’m understanding it correctly, each interaction results in a collapse, but only relative to a particular system. So from the second qubit’s perspective, its interaction with the first qubit causes it to collapse. But from the third qubit’s perspective, the first two qubits are in superposition until the interactions reach it. This sequence of disagreements continue all the way through the sequence. Of course, from the measuring device’s perspective, nothing has collapsed until it interacts with the system. This seems similar to the sequence under the relative state formulation, also known as the many-worlds interpretation (MWI). The difference is under this interpretation, the disagreements are resolved into an objective reality. Of course, the only way to resolve them is to have a copy of qubit 2 seeing qubit 1 in its 0 state, and another copy seeing it in its 1 state. All of these copies exist in their own branch of the superposition. Under both RQM and MWI, nothing fundamental changes on the event we label as “measurement.” The physical processes just cascade into a larger environment. Under RQM, this is handled by the stipulation that all states are only meaningful relative to a particular system, and that no universal description is possible. MWI instead simply sees the superpositions continue to cascade out in an unending process. As the number of quantum objects involved skyrocket, the phase relation between the branches of the superposition that allowed for interference between them, begins to alter. As the number of constituents increase, each branch’s phase increasingly becomes more unique, isolated from the others, until they no longer interfere with each other. Each becomes causally isolated, their own separate world. Some quantum computational theorists see the success of quantum computing as evidence for the MWI. Others point out that each of the other interpretations can provide an accounting. What that success does seem to do is put pressure on the interpretations that have an anti-real stance toward the wave function. As noted above, the idea that those computations aren’t physically happening in parallel somewhere seems dubious. Unless of course, in my admittedly very amateurish musings here, I’ve missed something. In particular, is there a stronger anti-real account that I’m overlooking? Are there problems with the other interpretations that do take a realist stance?
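A quick numerical sketch makes the scaling discussed in this post concrete: a classical simulator must track 2^n complex amplitudes for n qubits, which is why 53 qubits already strain a supercomputer and 300 qubits are hopeless. The NumPy snippet below simply builds equal superpositions and reports the bookkeeping cost; it says nothing about any particular interpretation.

```python
# How many amplitudes a classical simulator must track for n qubits.
import numpy as np

def uniform_superposition(n):
    """Statevector after applying a Hadamard to each of n qubits in |0...0>."""
    dim = 2 ** n
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

for n in (1, 2, 3, 10, 20):
    psi = uniform_superposition(n)
    print(f"{n:>2} qubits -> {psi.size:>10,} amplitudes, "
          f"{psi.nbytes / 1e6:8.3f} MB of memory")

# 53 qubits would need 2**53 ~ 9.0e15 amplitudes (~144 petabytes at 16 bytes
# each), and 300 qubits more amplitudes than there are particles in the
# observable universe -- the point made in the post above.
print(f"53 qubits -> {2**53:.1e} amplitudes")
```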
<urn:uuid:99d01990-3010-4583-95f6-d9d02f63ff60>
CC-MAIN-2021-04
https://selfawarepatterns.com/2020/10/11/thoughts-about-quantum-computing-and-the-wave-function/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703505861.1/warc/CC-MAIN-20210116074510-20210116104510-00664.warc.gz
en
0.934316
1,345
3.515625
4
The technology behind the quantum computers of the future is fast developing, with several different approaches in progress. Many of the strategies, or “blueprints,” for quantum computers rely on atoms or artificial atom-like electrical circuits. In a new theoretical study in the journal Physical Review X, a group of physicists at Caltech demonstrates the benefits of a lesser-studied approach that relies not on atoms but molecules. “In the quantum world, we have several blueprints on the table and we are simultaneously improving all of them,” says lead author Victor Albert, the Lee A. DuBridge Postdoctoral Scholar in Theoretical Physics. “People have been thinking about using molecules to encode information since 2001, but now we are showing how molecules, which are more complex than atoms, could lead to fewer errors in quantum computing.” At the heart of quantum computers are what are known as qubits. These are similar to the bits in classical computers, but unlike classical bits they can experience a bizarre phenomenon known as superposition in which they exist in two states or more at once. Like the famous Schrödinger’s cat thought experiment, which describes a cat that is both dead and alive at the same time, particles can exist in multiple states at once. The phenomenon of superposition is at the heart of quantum computing: the fact that qubits can take on many forms simultaneously means that they have exponentially more computing power than classical bits. But the state of superposition is a delicate one, as qubits are prone to collapsing out of their desired states, and this leads to computing errors. “In classical computing, you have to worry about the bits flipping, in which a ‘1’ bit goes to a ‘0’ or vice versa, which causes errors,” says Albert. “This is like flipping a coin, and it is hard to do. But in quantum computing, the information is stored in fragile superpositions, and even the quantum equivalent of a gust of wind can lead to errors.” However, if a quantum computer platform uses qubits made of molecules, the researchers say, these types of errors are more likely to be prevented than in other quantum platforms. One concept behind the new research comes from work performed nearly 20 years ago by Caltech researchers John Preskill, Richard P. Feynman Professor of Theoretical Physics and director of the Institute of Quantum Information and Matter (IQIM), and Alexei Kitaev, the Ronald and Maxine Linde Professor of Theoretical Physics and Mathematics at Caltech, along with their colleague Daniel Gottesman (Ph.D. ’97) of the Perimeter Institute in Ontario, Canada. Back then, the scientists proposed a loophole that would provide a way around a phenomenon called Heisenberg’s uncertainty principle, which was introduced in 1927 by German physicist Werner Heisenberg. The principle states that one cannot simultaneously know with very high precision both where a particle is and where it is going. “There is a joke where Heisenberg gets pulled over by a police officer who says he knows Heisenberg’s speed was 90 miles per hour, and Heisenberg replies, ‘Now I have no idea where I am,'” says Albert. The uncertainty principle is a challenge for quantum computers because it implies that the quantum states of the qubits cannot be known well enough to determine whether or not errors have occurred. 
However, Gottesman, Kitaev, and Preskill figured out that while the exact position and momentum of a particle could not be measured, it was possible to detect very tiny shifts to its position and momentum. These shifts could reveal that an error has occurred, making it possible to push the system back to the correct state. This error-correcting scheme, known as GKP after its discoverers, has recently been implemented in superconducting circuit devices. “Errors are okay but only if we know they happen,” says Preskill, a co-author on the Physical Review X paper and also the scientific coordinator for a new Department of Energy-funded science center called the Quantum Systems Accelerator. “The whole point of error correction is to maximize the amount of knowledge we have about potential errors.” In the new paper, this concept is applied to rotating molecules in superposition. If the orientation or angular momentum of the molecule shifts by a small amount, those shifts can be simultaneously corrected. “We want to track the quantum information as it’s evolving under the noise,” says Albert. “The noise is kicking us around a little bit. But if we have a carefully chosen superposition of the molecules’ states, we can measure both orientation and angular momentum as long as they are small enough. And then we can kick the system back to compensate.” Jacob Covey, a co-author on the paper and former Caltech postdoctoral scholar who recently joined the faculty at the University of Illinois, says that it might be possible to eventually individually control molecules for use in quantum information systems such as these. He and his team have made strides in using optical laser beams, or “tweezers,” to control single neutral atoms (neutral atoms are another promising platform for quantum-information systems). “The appeal of molecules is that they are very complex structures that can be very densely packed,” says Covey. “If we can figure out how to utilize molecules in quantum computing, we can robustly encode information and improve the efficiency in which qubits are packed.” Albert says that the trio of himself, Preskill, and Covey provided the perfect combination of theoretical and experimental expertise to achieve the latest results. He and Preskill are both theorists while Covey is an experimentalist. “It was really nice to have somebody like John to help me with the framework for all this theory of error-correcting codes, and Jake gave us crucial guidance on what is happening in labs.” Says Preskill, “This is a paper that no one of the three of us could have written on our own. What’s really fun about the field of quantum information is that it’s encouraging us to interact across some of these divides, and Caltech, with its small size, is the perfect place to get this done.” The Physical Review X study is titled “Robust encoding of a qubit in a molecule.” More information: Victor V. Albert et al. Robust Encoding of a Qubit in a Molecule. Physical Review X (2020). DOI: 10.1103/PhysRevX.10.031050 Image: In a new theoretical study, Caltech physicists have shown how molecules can, in theory, be used to reduce errors in quantum computing. This strategy would involve placing a rotating molecule in “superposition,” which means that it would exist in multiple orientations at once. In this illustration, three different molecular orientations are shown at left; the drawing at far right signifies a superposition of these molecular states.
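The "detect small shifts and push the system back" idea behind the GKP scheme described above can be illustrated with a deliberately classical toy, sketched below under invented parameters. Logical information lives only on a coarse grid of positions; noise adds a small displacement, and the decoder measures that displacement modulo the grid spacing, without ever learning which grid point (the logical value) it started from. This captures the flavour of the error-correction argument, not the actual quantum encoding in rotating molecules.

```python
# Classical toy of the GKP idea: detect small shifts modulo a grid spacing
# and push the system back, without learning the logical value itself.
import random

SPACING = 1.0          # logical values sit at integer multiples of SPACING

def add_noise(position, sigma=0.15):
    """Environmental kick: a small random displacement."""
    return position + random.gauss(0.0, sigma)

def correct(position):
    """Measure the shift modulo the spacing and undo it.

    The measurement reveals only the offset from the nearest grid point,
    never which grid point we started on."""
    shift = (position + SPACING / 2) % SPACING - SPACING / 2
    return position - shift, shift

random.seed(1)
logical = 3 * SPACING                  # some encoded logical value
noisy = add_noise(logical)
recovered, measured_shift = correct(noisy)
print(f"start {logical:.3f} -> noisy {noisy:.3f} "
      f"(shift {measured_shift:+.3f}) -> corrected {recovered:.3f}")
# Correction succeeds as long as the kick stays below SPACING / 2;
# larger shifts would be misidentified, just as in the real code.
```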
<urn:uuid:06865a4a-060a-4db7-ae8c-812a67150f9a>
CC-MAIN-2021-04
https://sciencebulletin.org/a-molecular-approach-to-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703514796.13/warc/CC-MAIN-20210118123320-20210118153320-00065.warc.gz
en
0.948933
1,480
3.75
4
A new method of implementing an ‘unbreakable’ quantum cryptographic system is able to transmit information at rates more than ten times faster than previous attempts. Researchers have developed a new method to overcome one of the main issues in implementing a quantum cryptography system, raising the prospect of a useable ‘unbreakable’ method for sending sensitive information hidden inside particles of light. By ‘seeding’ one laser beam inside another, the researchers, from the University of Cambridge and Toshiba Research Europe, have demonstrated that it is possible to distribute encryption keys at rates between two and six orders of magnitude higher than earlier attempts at a real-world quantum cryptography system. The results are reported in the journal Nature Photonics. Encryption is a vital part of modern life, enabling sensitive information to be shared securely. In conventional cryptography, the sender and receiver of a particular piece of information decide the encryption code, or key, up front, so that only those with the key can decrypt the information. But as computers get faster and more powerful, encryption codes get easier to break. Quantum cryptography promises ‘unbreakable’ security by hiding information in particles of light, or photons, emitted from lasers. In this form of cryptography, quantum mechanics are used to randomly generate a key. The sender, who is normally designated as Alice, sends the key via polarised photons, which are sent in different directions. The receiver, normally designated as Bob, uses photon detectors to measure which direction the photons are polarised, and the detectors translate the photons into bits, which, assuming Bob has used the correct photon detectors in the correct order, will give him the key. The strength of quantum cryptography is that if an attacker tries to intercept Alice and Bob’s message, the key itself changes, due to the properties of quantum mechanics. Since it was first proposed in the 1980s, quantum cryptography has promised the possibility of unbreakable security. “In theory, the attacker could have all of the power possible under the laws of physics, but they still wouldn’t be able to crack the code,” said the paper’s first author Lucian Comandar, a PhD student at Cambridge’s Department of Engineering and Toshiba’s Cambridge Research Laboratory. However, issues with quantum cryptography arise when trying to construct a useable system. In reality, it is a back and forth game: inventive attacks targeting different components of the system are constantly being developed, and countermeasures to foil attacks are constantly being developed in response. The components that are most frequently attacked by hackers are the photon detectors, due to their high sensitivity and complex design – it is usually the most complex components that are the most vulnerable. As a response to attacks on the detectors, researchers developed a new quantum cryptography protocol known as measurement-device-independent quantum key distribution (MDI-QKD). In this method, instead of each having a detector, Alice and Bob send their photons to a central node, referred to as Charlie. Charlie lets the photons pass through a beam splitter and measures them. The results can disclose the correlation between the bits, but not disclose their values, which remain secret. In this set-up, even if Charlie tries to cheat, the information will remain secure. 
MDI-QKD has been experimentally demonstrated, but the rates at which information can be sent are too slow for real-world application, mostly due to the difficulty in creating indistinguishable particles from different lasers. To make it work, the laser pulses sent through Charlie’s beam splitter need to be (relatively) long, restricting rates to a few hundred bits per second (bps) or less. The method developed by the Cambridge researchers overcomes the problem by using a technique known as pulsed laser seeding, in which one laser beam injects photons into another. This makes the laser pulses more visible to Charlie by reducing the amount of ‘time jitter’ in the pulses, so that much shorter pulses can be used. Pulsed laser seeding is also able to randomly change the phase of the laser beam at very high rates. The result of using this technique in a MDI-QKD setup would enable rates as high as 1 megabit per second, representing an improvement of two to six orders of magnitude over previous efforts. “This protocol gives us the highest possible degree of security at very high clock rates,” said Comandar. “It could point the way to a practical implementation of quantum cryptography.”
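For readers who want to see the key-distribution logic in code, here is a minimal toy of the sifting step in a BB84-style protocol, the simpler scheme that MDI-QKD builds on. It assumes an ideal channel, models no eavesdropper and no central node, and all names and parameters are invented for illustration.

```python
# Toy BB84-style key sifting: Alice and Bob keep only the positions where
# they happened to use the same measurement basis. No eavesdropper modelled.
import random

random.seed(7)
N = 32
alice_bits  = [random.randint(0, 1) for _ in range(N)]
alice_bases = [random.choice("RD") for _ in range(N)]   # R = rectilinear, D = diagonal
bob_bases   = [random.choice("RD") for _ in range(N)]

# Ideal channel: when bases match, Bob reads Alice's bit exactly;
# when they differ, his outcome is a coin flip and is discarded anyway.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

sifted_key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
print("raw length:", N, "-> sifted key length:", len(sifted_key))
print("sifted key:", "".join(map(str, sifted_key)))
# In the real protocol Alice and Bob would next compare a random sample of
# the sifted key; a raised error rate is the signature of an eavesdropper.
```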
<urn:uuid:d23dc39d-c954-449c-ab90-dcf0c04fa383>
CC-MAIN-2021-04
https://innovationtoronto.com/2016/04/laser-technique-promises-super-fast-super-secure-quantum-cryptography/?share=telegram
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703524270.28/warc/CC-MAIN-20210121070324-20210121100324-00265.warc.gz
en
0.931611
1,610
3.75
4
Quantum Computing Is a Bigger Deal Than the Internet

In a paper published in the journal Nature on October 23, researchers reported that the group behind Google’s quantum computer “Sycamore” were able to use their machine to solve a problem in only 200 seconds. This was not just any problem: it was one so hard that it could have taken the world’s most powerful traditional supercomputer around 10,000 years to finish. And this is only a tiny fraction of what quantum computing might accomplish.

For hundreds of thousands of years, the only real tools humans had were stones, our brains, and fire. But the best tool we have ever invented is the computer. In the very small span of time extending from the mid-20th century to the present, we have entered a realm of exponential progress, as processing power approximately doubles every few years.

Computers are essentially a collection of simple elements that each have defined responsibilities: storing information in memory, processing information via logic and math, and a means to control all of it through instructions. A computer processor is one of the most fundamental components. Every chip has different modules that each do something specific. Each module contains logic gates, which are made of transistors. Transistors are the 1-or-0 “pieces,” off or on: switches that turn the flow of electrons on and off. Many transistors make up the logic gates, which allow for combinations that can do more advanced operations like multiplication and division. With enough of these, you can process a great deal of information, which now lets us do important work like mathematics and… video games!

Today, a transistor can be about 40 nanometers or smaller, nearly 500 times smaller than an average cell in your body. At this scale, the electrons do not have to flow through a channel at all; they can simply cross it by “quantum tunneling.”

So, to take advantage of physics at the quantum level, we are building quantum computers. Rather than using bits as our smallest unit of information, we now have qubits. Beyond this, in quantum physics the states do not have to be just on or off, yes or no; they can also take advantage of “superposition,” a quantum property that allows a particle to be in any combination or proportion of those states. Like Schrodinger’s cat, the particle can be in a blend of states, but if you actually test or observe it, it will show only one of them. So, when you are not observing it, the particle can be both partially vertically and partially horizontally polarized; when you check on it, it will only show you one of those states.

What superposition really means is that we now have a radically increased number of possible combinations. In regular computing, 4 bits yield 16 possible combinations, but you can only use one of them at a time. By contrast, 4 qubits can hold all 16 of those values at the same time.

Another remarkable property that qubits can display is quantum entanglement, in which two qubits are strangely connected and respond to one another’s states, however far apart they may be in the physical universe. With this property, we can measure one qubit and know the properties of its entangled partner at the same time. A quantum internet would greatly increase information access and allow distributed computational efforts to reach even greater heights.
If you will allow me a tangent, quantum entanglement has also enabled research into quantum teleportation. By entangling particles and then separating them to different locations, we can use traditional communication methods to transfer the state of one particle onto its entangled partner, no matter how far apart they may be.

And yet another property we can make the most of is qubit manipulation. Our ordinary logic gates take a set of inputs and give us a single output. A “quantum gate” takes an input of superposed qubits, rotates probabilities, and produces a new superposition as its output. Then the qubits can be measured, and we get the 0s and 1s that represent the data we need. The important thing here is that all of the possible answers are generated at the same time, not just the single output of a conventional logic gate. The answer we read out is likely correct, but there is a small chance it may not be. Since all of the possibilities have been created, though, it is quick work to check the rest until we get exactly the right one.

Though not perfect, what really makes quantum computing special, beyond its storage capacity, is how efficient and fast it is. One great application of this is databases. We can store a stunningly large amount of information and search through it much faster than with traditional computing.

“It’s my personal belief that quantum computing can help us make sense of this deluge of data we find ourselves creating, to solve some rather interesting problems. There are systems generating countless data sets daily, and those might be the answer to essential problems affecting society…” – William Hurley, chair of the Quantum Computing Standards Workgroup at the Institute of Electrical and Electronics Engineers (IEEE)

Quantum computing can also run huge numbers of calculations and probability estimates at amazing rates, which also benefits simulations. These quantum simulations will help us in research on climate, genetics and disease, quantum physics, and generally anything that requires massive amounts of number crunching.

One negative impact of quantum computing is that it vastly increases the rate at which someone can break passwords or other security measures, compared to brute-force attempts on a traditional computer.

We need a new paradigm for our advancement to continue, and quantum computing can be it. We likely won’t see quantum computers in households any time soon, but scientists and researchers are already using them for large-scale projects.

The Information Age has been a hugely profitable time for our planet: the power of computing has led to amazing advances in most fields of human endeavor, while also greatly contributing to raising the standard of living for most people. We now generate more new information and knowledge annually than we recorded in most of human history. However, as we push the abilities of these artificial minds ever further, we are playing with a tool that is, in some ways, more powerful and dangerous than nuclear power. Quantum computing may close whatever gap remains between A.I. in its current state and the “singularity,” the point in the future when A.I. becomes self-aware. And in the wrong hands, quantum computing could enable genetic tampering that produces super-soldiers or super-diseases.
We have to continue to push full steam ahead on research so that we can understand these risks, while also benefiting from the unique advantages of quantum computing.
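One of the speed-ups mentioned above, searching a database, is easy to put numbers on. Grover's algorithm needs roughly (pi/4)·sqrt(N) lookups in an unsorted list of N items, versus about N/2 on average classically. The snippet below only evaluates those two formulas; it is back-of-the-envelope arithmetic, not a simulation of the algorithm itself.

```python
# Classical vs. Grover search query counts for an unsorted database of N items.
from math import pi, sqrt

for exponent in (6, 9, 12):
    N = 10 ** exponent
    classical = N / 2                 # expected lookups by linear search
    grover = (pi / 4) * sqrt(N)       # oracle calls needed by Grover's algorithm
    print(f"N = 10^{exponent}: classical ~ {classical:.2e} lookups, "
          f"Grover ~ {grover:.2e} oracle calls "
          f"({classical / grover:.0f}x fewer)")
```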
<urn:uuid:2f4830c2-a857-4056-8eda-83a1cafbe28e>
CC-MAIN-2021-04
https://guteblog.themesvillage.com/demo1/quantum-computing-is-a-bigger-deal-than-the-internet/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703581888.64/warc/CC-MAIN-20210125123120-20210125153120-00666.warc.gz
en
0.930189
1,549
3.5
4
- Quantum computing
- Quantum teleportation
- Quantum cryptography
- Sources for single or entangled photons

It was only a short time after the formulation and acceptance of quantum theory that scientists started to discuss possible benefits of this theory for mankind. The quantum computer, probably the most famous application of quantum theory, is expected to reach incredible computing speeds that enable calculations which were not possible before. Any coupled quantum mechanical system can be used for quantum computing. Solid state systems, trapped ions, atoms in optical lattices, and photons with linear optical elements are at the heart of quantum computer research. First quantum operations have been demonstrated with solid state systems and trapped ions, but the race is still open.

Basis for quantum technologies

The basis for quantum computing is “entanglement”, a quantum mechanical property of a system in which the state of one part of the system is fully linked to the state of another part. The famous “Schrödinger cat” example tries to visualize how strange entanglement is compared to experiences in daily life. Even Einstein doubted this property so much that he and his colleagues Podolsky and Rosen published an article in 1935 in which they intended to prove that quantum theory cannot be complete and would have to be substituted by another theory including variables that in quantum theory are still “hidden”. Their “EPR paradox” argument was first theoretically refuted by Bell (“Bell’s theorem”), who showed that quantum mechanics is indeed complete. To this day, Bell’s theorem has been supported experimentally many times. No hidden variables are needed to describe quantum behaviour completely.

The strange property of entanglement is also the basis for quantum teleportation – where one transfers a quantum mechanical state from one system at one place to another system at another place – and quantum cryptography. The goal of the latter is to send information from one place to another in a completely secure way. Obviously, a quantum cryptography apparatus would be a very powerful and important instrument. Quantum cryptography relies mostly on single or entangled photons and is already commercialized.

High speed with quantum computing

Quantum computing is expected to allow for calculations, simulations or operations at a speed that classical computing can never reach. For example, it was theoretically shown that a quantum computer would be able to perform database searches or factorization of large numbers much faster than classical computers. The enormous calculation power of a quantum computer is a consequence of two main ingredients. First of all, the fundamental piece of information is a quantum mechanical two-state system (|0> and |1>) called a qubit that – unlike a classical bit, which is either 0 or 1 – can be in any superposition (a|0> + b|1>) of the two states. Second, the basic calculations are coherent operations that act on such a superposition state. This way, all possible realizations of anything between |0> and |1> can be computed simultaneously and highly parallel computation is realized. Gate operations, the fundamental operations of computing, have been shown with trapped ions and with photon-based quantum computers. Using solid state systems (NMR), a proof of principle for quantum-computed factorization of the number 15 was demonstrated.
State transfer with quantum teleportation

Quantum teleportation refers to a procedure in which the quantum mechanical state of one object is fully transferred to another object at a different place. It makes use of the non-locality of entanglement that confused not only Einstein. Using a clever sequence of measurements and entanglement operations on photons, the polarization state of one photon could be mapped completely onto another photon. Just recently, quantum teleportation between distant matter qubits was shown using two separate ion traps. Closely related to quantum teleportation and quantum computing is so-called “quantum logic”. Here, depending on the quantum state of one object, a specific state of another object is created. This controlled state preparation was used in metrology to realize one of the best atomic clocks in the world, based on aluminum ions.

Secure communication with quantum cryptography

Quantum cryptography uses quantum physics properties like entanglement and the back action of the measurement process on a quantum state to achieve secure communication between a sender (Alice) and a receiver (Bob). The standard approach is that Alice and Bob perform measurements on entangled quantum systems, usually entangled photons, in order to create a key for Alice and Bob. Since they can then use this code to encrypt and decrypt the real message, the quantum cryptography method is called quantum key distribution. The real message is encrypted by Alice according to her measurement results and sent through an open channel (so anyone is allowed to “listen”) to Bob, who decrypts the message according to his measurements. Any eavesdropping, i.e. any attempt of a third party to detect the quantum key, can be detected because, according to the laws of quantum physics, each measurement influences the quantum mechanical state itself. Eavesdropping would therefore always be noticed. Due to its obvious significance, quantum cryptography research is pursued intensively and many results have been achieved so far. Quantum key distribution over hundreds of km in fiber, or over a whole city in free space, has already been demonstrated, while satellite links of entangled photons between earth stations are currently being explored. To prove its usability, a quantum-encrypted bank transaction was undertaken.

Important tools for quantum computing & cryptography

Sources for single or entangled photons are important tools for quantum computing and quantum cryptography. Single photon sources emitting exactly one photon at a triggered time can be realized in many ways, incorporating e.g. color centers or ions in solids, single atoms in traps or optical cavities, trapped ions or quantum dot systems. The most common source for entangled photons is based on spontaneous parametric down conversion. A “blue” photon is converted into two red photons within a non-linear optical crystal. Polarization, momentum and energy of the two photons are strongly correlated. A lot of research on this topic is under way. Main efforts are focused on the development of efficient – ideally fully deterministic – sources and realizations with mass-production potential.

TOPTICA’s added value

TOPTICA is a highly appreciated supplier for quantum information experiments that involve trapped ions or atoms. Our lasers are successfully applied to cool, trap, optically pump or coherently manipulate ions and atoms. They are fabricated or tuned to the required wavelength such that they can be used to excite single photon emitters.
To create entangled photon pairs by parametric down conversion one needs a fundamental laser at half the wavelength of the photon pair in order to initiate the conversion process. Frequently, entangled photons in the near infrared around 800 nm are used and hence violet lasers around 400 nm are required. The development and fabrication of lasers in the UV is TOPTICA’s core competence. We were the first company to produce diode laser systems in the UV and offer a variety of systems with different linewidth/coherence characteristics and power levels for scientific research and industry. No other company has a similar product portfolio. Please contact us to find the best laser for your application.
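The energy-conservation relation behind down-conversion mentioned above, 1/λ_pump = 1/λ_signal + 1/λ_idler, is easy to check numerically. The small helper below is an illustrative calculation only; the function name and example wavelengths are chosen for this sketch.

```python
# Energy conservation in spontaneous parametric down-conversion (SPDC):
# 1/lambda_pump = 1/lambda_signal + 1/lambda_idler  (wavelengths in nm).
def idler_wavelength(pump_nm, signal_nm):
    return 1.0 / (1.0 / pump_nm - 1.0 / signal_nm)

# A 400 nm pump split into a degenerate pair gives two 800 nm photons.
print(idler_wavelength(400, 800))              # -> 800.0
# A non-degenerate example: 400 nm pump with a 760 nm signal photon.
print(round(idler_wavelength(400, 760), 1))    # -> 844.4
```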
<urn:uuid:bdfe7b09-e82d-4648-9265-3cd5bec6d726>
CC-MAIN-2021-04
https://www.toptica.com/ja/applications/applied-quantum-technology/communication/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610704800238.80/warc/CC-MAIN-20210126135838-20210126165838-00267.warc.gz
en
0.934916
1,434
3.828125
4
17 Jun Sampling photons to simulate molecules • Physics 13, 97 A quantum simulator uses microwave photons to tackle a useful chemistry problem—determining the vibronic spectra of molecules. In 2019, researchers claimed they had achieved an important milestone in quantum computing—the demonstration that a programmable quantum computer can outperform the most powerful classical computer in a specific task . Using Google’s 53-qubit Sycamore quantum processor, they carried out, in just over three minutes, a “sampling” operation that could have taken, according to their estimates, thousands of years on a classical supercomputer. The task, which consisted of sampling the outputs of a random quantum circuit, had previously been identified as a promising testbed for demonstrating this quantum superiority . Alas, sampling isn’t generally linked to applications of any practical relevance. Now, experiments by Christopher Wang at Yale University and colleagues show that a superconducting quantum device manipulating microwave photons can tackle a useful sampling problem—determining the so-called vibronic spectra of small molecules . While the scheme doesn’t yet achieve a quantum advantage, it holds great potential for doing so if further scaled up. In a vibronic transition, the absorption of a photon by a molecule results in the simultaneous change of both vibrational and electronic energy levels. These transitions are relevant to many important photoinduced molecular processes, including light absorption and emission, photoelectron emission, and Raman scattering. Theoretically, the intensities of these transitions depend on Franck-Condon factors, which quantify the transition probability based on the overlap between the wave functions of the initial and final vibrational states. While there is no definitive mathematical proof that a classical computer can’t reliably calculate these quantities, we know that classical algorithms are inefficient at this task. The difficulty stems from the fact that each vibrational mode of a molecule can’t be considered as an ideal, independent harmonic oscillator—one mode might be coupled to several other modes, for instance. As a consequence, computational complexity rises rapidly with the number of atoms, and even relatively small molecules can be hard to model. In 2014, Harvard University researchers proposed that the computation of vibronic spectra can be viewed as a sampling problem, which could be simulated with a quantum setup . The result was based on an approach developed in 1977 by theorist Evgeny Doktorov and co-workers, who showed that three physical processes involved in the vibronic transition—molecular structural deformation, vibrational frequency changes, and mixing of vibrational modes—can be interpreted in terms of three quantum optical operators: displacement, squeezing, and rotation, respectively . Using this approach, the Harvard team showed that calculating the vibronic spectrum of a molecule is equivalent to solving a “boson sampling” task . Boson sampling consists in sampling the probability distribution of photons at the output of an optical network. This idea inspired experimental implementations both with a quantum optical setup [6, 7] and with trapped ions . However, scaling up these quantum simulators to solve meaningful problems faces formidable challenges. 
Specifically, it requires the generation of hard-to-prepare squeezed states—in which quantum fluctuations in the photon number are reduced compared to conventional coherent light—as well as the simultaneous measurement of the quantum states of a large number of photons.
Following previous theoretical work, Wang and co-workers have now demonstrated experimentally another approach to vibronic-spectra computation based on superconducting microwave circuits. This solution overcomes most of the above-mentioned hurdles by exploiting the remarkable degree of control and tunability of superconducting microwave circuits. Their quantum simulator design consists of an array of superconducting microwave cavities, or resonators, each of which is coupled to the others through so-called transmon qubits (Fig. 1). These qubits allow the researchers to fine-tune the coupling between the resonators. Loosely speaking, each resonator represents one vibrational mode of a molecule, while the tunable coupling mimics the interaction between the modes. In their proof-of-principle demonstration, the researchers used the photonic modes of two coupled superconducting resonators to simulate the photoelectron spectra of several simple triatomic molecules: water, ozone, nitrogen dioxide, and sulfur dioxide. By driving the transmon qubit at the appropriate frequencies, the researchers produced an interaction between resonators that implemented the rotation, displacement, and squeezing operations necessary to implement the Doktorov approach.
To reconstruct the Franck-Condon profiles of the molecules, the researchers needed to sample the photons in each of the resonators without perturbing them. To do so, they improved a previous quantum nondemolition (QND) scheme. In the scheme, a photon in a cavity can be measured nondestructively through the effect it has on the transition frequency of an "ancillary" qubit coupled to the cavity. This method has proven to work well with single photons but, so far, could not measure larger photon numbers. To address this issue, Wang and collaborators came up with a clever alternative: Using sequential QND measurements, they managed to resolve up to 15 photons in each of the resonators. This number is way beyond what was possible in previous photonic and trapped ion platforms, allowing the device to carry out a task that was challenging for previous simulators—simulating the vibronic spectra of molecules that are in vibrationally excited states.
The current capabilities of this quantum simulator are still far from surpassing those of classical computers for this particular chemistry problem, as the spectra for these triatomic molecules can be calculated more precisely and rapidly with conventional methods. But one can expect that further advances in superconducting circuit technology will soon allow for much more interesting simulations. If the number of anharmonically coupled resonators could be increased to more than ten, for instance, the scheme could already simulate molecules that are challenging for classical computations. More sophisticated circuits could also account for effects that are extremely hard to model classically. First, a tailored photon loss mechanism in the circuit could mimic dissipation in real molecules. Second, Kerr nonlinearity—the dependence of the refractive index on light intensity—could be introduced in the setup to simulate the anharmonic effects that lead to high-order correlations between vibrational modes.
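To make the Franck-Condon picture above concrete, here is a minimal classical sketch (not part of the experiment) of the simplest limiting case: a single vibrational mode at zero temperature whose equilibrium position is merely displaced between the two electronic states, with no squeezing or mode mixing. In that limit the vibronic line intensities reduce to a Poisson progression governed by the Huang-Rhys factor S.

```python
import math

def franck_condon_progression(S, n_max=10):
    """0 -> n Franck-Condon factors for a single displaced harmonic mode.

    S is the Huang-Rhys factor (half the squared dimensionless displacement).
    In this displacement-only, zero-temperature limit the intensities follow
    a Poisson distribution with mean S; real molecules also need the squeezing
    and rotation (mode-mixing) operations described in the text above.
    """
    return [math.exp(-S) * S ** n / math.factorial(n) for n in range(n_max + 1)]

# Example: S = 1.5 gives a progression peaking around the n = 1 line
for n, fc in enumerate(franck_condon_progression(1.5)):
    print(f"0 -> {n}: {fc:.4f}")
```

Once several coupled, anharmonic modes enter the picture, this closed-form shortcut disappears, which is precisely where the sampling approach described above becomes attractive.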
With these improvements, the setup could allow researchers to simulate a wealth of molecular processes and effects that are often beyond the reach of classical simulations, including non-Condon transitions, resonant Raman scattering, multiphoton processes, vibrational circular dichroism, conical intersections, open quantum dynamics, and many others.
This research is published in Physical Review X.
References
- F. Arute et al., "Quantum supremacy using a programmable superconducting processor," Nature 574, 505 (2019).
- S. Aaronson and A. Arkhipov, "The computational complexity of linear optics," Proc. ACM STOC 2011 (2011); S. Boixo et al., "Characterizing quantum supremacy in near-term devices," Nat. Phys. 14, 595 (2018).
- C. S. Wang et al., "Efficient multiphoton sampling of molecular vibronic spectra on a superconducting bosonic processor," Phys. Rev. X 10, 021060 (2020).
- J. Huh et al., "Boson sampling for molecular vibronic spectra," Nat. Photon. 9, 615 (2015).
- E. V. Doktorov et al., "Dynamical symmetry of vibronic transitions in polyatomic molecules and the Franck-Condon principle," J. Molec. Spectrosc. 64, 302 (1977).
- W. R. Clements et al., "Approximating vibronic spectroscopy with imperfect quantum optics," J. Phys. B 51, 245503 (2018).
- C. Sparrow et al., "Simulating the vibrational quantum dynamics of molecules using photonics," Nature 557, 660 (2018).
- Y. Shen et al., "Quantum optical emulation of molecular vibronic spectroscopy using a trapped-ion device," Chem. Sci. 9, 836 (2018).
- B. Peropadre et al., "Proposal for microwave boson sampling," Phys. Rev. Lett. 117, 140505 (2016).
- B. R. Johnson et al., "Quantum non-demolition detection of single microwave photons in a circuit," Nat. Phys. 6, 663 (2010).
https://fiberguide.net/tech-guides/sampling-photons-to-simulate-molecules/
Famed physicist Stephen Hawking proposed in 1974 that very small amounts of high-energy radiation, in the form of entangled particle pairs, known as Hawking radiation, could theoretically escape a black hole. This was controversial, as it went against the conventional understanding that nothing, not energy or light, could escape a black hole. Since 1981, however, when physicist William Unruh discovered that fluid flows could mimic black holes, the hunt for this elusive process has driven researchers to create analogue black holes to test the possibility of particles behaving unusually at a black hole's event horizon. Though so far it has not been possible to create a true black hole in a lab, researchers have used sound waves to make "dumb" or acoustic black holes since 2009. In 2015, Jeff Steinhauer, a physicist at the Israel Institute of Technology in Haifa who has been working on these black holes for the past seven years, became the first researcher to claim to have seen Hawking radiation in his lab-made, analogue black hole.
Acoustic black holes use sound waves (phonons) rather than photons, and are made by cooling rubidium atoms to barely above absolute zero. The atoms then share a "quantum connection," says Steinhauer, known as quantum entanglement: even when they are far apart, the paired atoms mimic each other's behavior, and the cloud clumps up to form a Bose-Einstein condensate (BEC). He explains, "[The BEC] flows faster than the speed of sound, which means a sound wave trying to go against the flow falls back, which is analogous to a photon trying to escape a black hole." He compares this to trying to swim against the direction of a river's flow. "The river flows faster than you can swim, so you feel like you're flowing forward, but you're actually falling back." This, he says, is what happens to the sound waves trying to escape the acoustic black hole.
Hawking radiation makes the case that the universe is teeming with entangled particles. These particle pairs each contain an electron with a negative charge and a positron with a positive charge, which pop in and out of existence. If they come in contact, they will destroy each other, with one exception: if they should happen to appear at a black hole's event horizon. There, Hawking theorized, one particle falls into the hole, and the other dissipates into space. This assumes that when the black hole eventually loses mass and evaporates, it also takes all the particles or "information" that fell into it with it, essentially destroying that information. Steinhauer says the problem with this, which is known as the black hole information paradox, is that, "There's a law of quantum mechanics that says information can't be destroyed. So when information falls into a black hole, is it destroyed or does it exist in the Hawking radiation coming out or somewhere else?" Hawking published a theoretical solution to this problem earlier this year, suggesting that black holes may have a halo of soft hair around them capable of storing information, but no one knows for sure. Others have theorized that the information is, in fact, contained in the Hawking radiation coming out of the black hole itself, which is what causes it to lose mass over time and dissipate. However, Steinhauer argues that this conflicts with existing laws of physics: "My work says there is a problem with that solution, because outgoing particles are already entangled with in-falling particles, so they can't be entangled with others."
Steinhauer and his research team ran their experiment 4,600 times over six days straight, taking photos of the BEC each time to form a composite known as a "correlation function." The composite image shows a very thin gray band, which, he says, "means there are correlations between a point inside the black hole and outside the black hole. Waves are emitted from the horizon. The correlations between the waves falling in and coming out lets us see it." Moreover, it's the fact that the particles are in pairs that allows researchers to see that line. "My theoretical paper says if the gray band is narrow, the particles are entangled. They found that only high energy pairs were entangled, while low energy pairs were not. I knew immediately they were." Thus, he is confident that he was seeing Hawking radiation for the first time.
In addition, the particles coming out of the acoustic black hole's event horizon produced so much energy that their experiment may support "the firewall controversy," yet another hypothesis, which suggests that the effort of breaking the entanglement between the Hawking particles and their partners creates actual flames at the edge of a black hole. "So I saw that the particles really were entangled in my black hole, which implied that there really is an issue to solve, and one of the possible ways would be a firewall. It would naturally occur to preserve the laws of physics," says Steinhauer.
While Steinhauer's research team is understandably excited, the scientific community isn't leaping up to confirm Hawking radiation just yet; that will require further experiments to see whether the result can be replicated. No one has ever created an actual black hole in a lab, says Harry E. Keller, PhD, President, Chief Science Officer and Founder of Smart Science Education. Nor has anyone been able to see Hawking radiation in space from a true black hole, which would make Hawking an instant candidate for a Nobel Prize. "Were such an artifact to be made, it would be so dangerous that it could swallow up our planet along with all of us." Instead, this BEC, which he refers to as "a new state of matter beyond the usual gas, liquid, solid, and plasma," is in his estimation not a good enough analogue of a true black hole. He compares it to "playing with magnets to figure out how planetary systems work." He continues, "The analogue experience might mimic the quantum mechanics of black holes and Hawking radiation, or it might not. It's still interesting in its own right."
Steinhauer thinks it is more than a little bit interesting. "The point of seeing Hawking radiation is not to learn about black holes but to understand what the new laws of physics are." Hawking was the first to combine gravity with quantum field theory to come up with the idea of Hawking radiation in the first place. "This combination is considered a first step on the road to a theory of quantum gravity," Steinhauer says. "People have many ideas but nobody's sure whose ideas are right."
https://secondnexus.com/science/labmade-black-hole
Diffie–Hellman key exchange (DH) is a method of securely exchanging cryptographic keys over a public channel and was one of the first public-key protocols, as conceived by Ralph Merkle and named after Whitfield Diffie and Martin Hellman. DH is one of the earliest practical examples of public key exchange implemented within the field of cryptography. Traditionally, secure encrypted communication between two parties required that they first exchange keys by some secure physical means, such as paper key lists transported by a trusted courier. The Diffie–Hellman key exchange method allows two parties that have no prior knowledge of each other to jointly establish a shared secret key over an insecure channel. This key can then be used to encrypt subsequent communications using a symmetric key cipher. DH has been widely used on the Internet to establish encryption keys between communicating parties. One caveat: DH on its own is only useful if both communicating sides, A and B, are under your control, because it merely strengthens the already established connection between client A and B and does not protect against man-in-the-middle attacks. If a malicious user manages to connect to B while pretending to be A, the encryption will still be established. Alternatively, the Diffie-Hellman key exchange can be combined with an algorithm like the Digital Signature Standard (DSS) to provide authentication, key exchange, confidentiality, and a check on the integrity of the data. In such a situation, RSA is not necessary for securing the connection.
TLS, which is a protocol that is used to secure much of the internet, can use the Diffie-Hellman exchange in three different ways: anonymous, static and ephemeral. In practice, only ephemeral Diffie-Hellman should be implemented, because the other options have security issues.
- Anonymous Diffie-Hellman – This version of the Diffie-Hellman key exchange doesn't use any authentication, leaving it vulnerable to man-in-the-middle attacks. It should not be used or implemented.
- Static Diffie-Hellman – Static Diffie-Hellman uses certificates to authenticate the server. It does not authenticate the client by default, nor does it provide forward secrecy.
- Ephemeral Diffie-Hellman – This is considered the most secure implementation because it provides perfect forward secrecy. It is generally combined with an algorithm such as DSA or RSA to authenticate one or both of the parties in the connection. Ephemeral Diffie-Hellman uses different key pairs each time the protocol is run. This gives the connection perfect forward secrecy, because even if a key is compromised in the future, it can't be used to decrypt all of the past messages.
The DH parameters can be generated with the openssl command using, depending on your preference, a 1024-, 2048-, or 4096-bit length. Of course, it is best to use the strongest setting practical, i.e., 4096 bits.
The Logjam attack
The Diffie-Hellman key exchange was designed on the basis of the discrete logarithm problem being difficult to solve. The most effective publicly known mechanism for finding the solution is the number field sieve algorithm. The capabilities of this algorithm were taken into account when the Diffie-Hellman key exchange was designed. By 1992, it was known that for a given group, G, three of the four steps involved in the algorithm could potentially be computed beforehand. If this progress was saved, the final step could be calculated in a comparatively short time.
This wasn’t too concerning until it was realized that a significant portion of internet traffic uses the same groups that are 1024 bits or smaller. In 2015, an academic team ran the calculations for the most common 512-bit prime used by the Diffie-Hellman key exchange in TLS. They were also able to downgrade 80% of TLS servers that supported DHE-EXPORT, so that they would accept a 512-bit export-grade Diffie-Hellman key exchange for the connection. This means that each of these servers is vulnerable to an attack from a well-resourced adversary. The researchers went on to extrapolate their results, estimating that a nation-state could break a 1024-bit prime. By breaking the single most-commonly used 1024-bit prime, the academic team estimated that an adversary could monitor 18% of the one million most popular HTTPS websites. They went on to say that a second prime would enable the adversary to decrypt the connections of 66% of VPN servers, and 26% of SSH servers. Later in the report, the academics suggested that the NSA may already have these capabilities. “A close reading of published NSA leaks shows that the agency’s attacks on VPNs are consistent with having achieved such a break.” Despite this vulnerability, the Diffie-Hellman key exchange can still be secure if it is implemented correctly. As long as a 2048-bit key is used, the Logjam attack will not work. Updated browsers are also secure from this attack. Is the Diffie-Hellman key exchange safe? While the Diffie-Hellman key exchange may seem complex, it is a fundamental part of securely exchanging data online. As long as it is implemented alongside an appropriate authentication method and the numbers have been selected properly, it is not considered vulnerable to attack. The Diffie-Hellman key exchange was an innovative method for helping two unknown parties communicate safely when it was developed in the 1970s. While we now implement newer versions with larger keys to protect against modern technology the protocol itself looks like it will continue to be secure until the arrival of quantum computing and the advanced attacks that will come with it. Here is how easy it is to add this extra encryption to make the SSL tunnel between A and B stronger. On a Linux / Mac / BSD OS machine install and use openssl client like so: # openssl dhparam -out dhparams1.pem 2048 Generating DH parameters, 2048 bit long safe prime, generator 2 This is going to take a long time Be aware that the Diffie-Hellman key exchange would be insecure if it used numbers as small as those in our example. We are only using such small numbers to demonstrate the concept in a simpler manner. # cat dhparams1.pem —–BEGIN DH PARAMETERS—– —–END DH PARAMETERS—– Copy the generated DH PARAMETERS headered key string to your combined .PEM certificate pair at the end of the file and save it # vim /etc/haproxy/cert/ssl-cert.pem —–BEGIN DH PARAMETERS—– —–END DH PARAMETERS—– Restart the WebServer or Proxy service wher Diffie-Hellman key was installed and Voila you should a bit more secure.
https://pc-freak.net/blog/improve-ssl-security-generate-add-diffie-hellman-key-ssl-certificate-stronger-line-encryption/
What is quantum physics? Put simply, it’s the physics that explains how everything works: the best description we have of the nature of the particles that make up matter and the forces with which they interact. Quantum physics underlies how atoms work, and so why chemistry and biology work as they do. You, me and the gatepost – at some level at least, we’re all dancing to the quantum tune. If you want to explain how electrons move through a computer chip, how photons of light get turned to electrical current in a solar panel or amplify themselves in a laser, or even just how the sun keeps burning, you’ll need to use quantum physics. The difficulty – and, for physicists, the fun – starts here. To begin with, there’s no single quantum theory. There’s quantum mechanics, the basic mathematical framework that underpins it all, which was first developed in the 1920s by Niels Bohr, Werner Heisenberg, Erwin Schrödinger and others. It characterises simple things such as how the position or momentum of a single particle or group of few particles changes over time. But to understand how things work in the real world, quantum mechanics must be combined with other elements of physics – principally, Albert Einstein’s special theory of relativity, which explains what happens when things move very fast – to create what are known as quantum field theories. Three different quantum field theories deal with three of the four fundamental forces by which matter interacts: electromagnetism, which explains how atoms hold together; the strong nuclear force, which explains the stability of the nucleus at the heart of the atom; and the weak nuclear force, which explains why some atoms undergo radioactive decay. Over the past five decades or so these three theories have been brought together in a ramshackle coalition known as the “standard model” of particle physics. For all the impression that this model is slightly held together with sticky tape, it is the most accurately tested picture of matter’s basic working that’s ever been devised. Its crowning glory came in 2012 with the discovery of the Higgs boson, the particle that gives all other fundamental particles their mass, whose existence was predicted on the basis of quantum field theories as far back as 1964. Conventional quantum field theories work well in describing the results of experiments at high-energy particle smashers such as CERN’s Large Hadron Collider, where the Higgs was discovered, which probe matter at its smallest scales. But if you want to understand how things work in many less esoteric situations – how electrons move or don’t move through a solid material and so make a material a metal, an insulator or a semiconductor, for example – things get even more complex. The billions upon billions of interactions in these crowded environments require the development of “effective field theories” that gloss over some of the gory details. The difficulty in constructing such theories is why many important questions in solid-state physics remain unresolved – for instance why at low temperatures some materials are superconductors that allow current without electrical resistance, and why we can’t get this trick to work at room temperature. But beneath all these practical problems lies a huge quantum mystery. At a basic level, quantum physics predicts very strange things about how matter works that are completely at odds with how things seem to work in the real world. 
Quantum particles can behave like particles, located in a single place; or they can act like waves, distributed all over space or in several places at once. How they appear seems to depend on how we choose to measure them, and before we measure they seem to have no definite properties at all – leading us to a fundamental conundrum about the nature of basic reality. This fuzziness leads to apparent paradoxes such as Schrödinger’s cat, in which thanks to an uncertain quantum process a cat is left dead and alive at the same time. But that’s not all. Quantum particles also seem to be able to affect each other instantaneously even when they are far away from each other. This truly bamboozling phenomenon is known as entanglement, or, in a phrase coined by Einstein (a great critic of quantum theory), “spooky action at a distance”. Such quantum powers are completely foreign to us, yet are the basis of emerging technologies such as ultra-secure quantum cryptography and ultra-powerful quantum computing. But as to what it all means, no one knows. Some people think we must just accept that quantum physics explains the material world in terms we find impossible to square with our experience in the larger, “classical” world. Others think there must be some better, more intuitive theory out there that we’ve yet to discover. In all this, there are several elephants in the room. For a start, there’s a fourth fundamental force of nature that so far quantum theory has been unable to explain. Gravity remains the territory of Einstein’s general theory of relativity, a firmly non-quantum theory that doesn’t even involve particles. Intensive efforts over decades to bring gravity under the quantum umbrella and so explain all of fundamental physics within one “theory of everything” have come to nothing. Meanwhile cosmological measurements indicate that over 95 per cent of the universe consists of dark matter and dark energy, stuffs for which we currently have no explanation within the standard model, and conundrums such as the extent of the role of quantum physics in the messy workings of life remain unexplained. The world is at some level quantum – but whether quantum physics is the last word about the world remains an open question. Richard Webb
https://www.newscientist.com/term/quantum-physics/
Information technology and communications have emerged as main technological pillars of the modern age, and cryptography has grown in importance due to the requirement for security services (confidentiality, integrity, authenticity, and non-repudiation) in data storage and transmission. Quantum computing, first introduced as a concept in 1982, has now become a nightmare for currently deployed cryptographic mechanisms. Extensive research has been done on quantum platforms to solve complex mathematical problems that are intractable for traditional computing platforms, and the realization of such quantum computing platforms poses serious threats to cryptographic algorithms. This article informs the reader about the implications of quantum computing for present-day cryptography in detail.
Types of Cryptographic Algorithms
Three categories of cryptographic algorithms exist, based on the number of cryptographic keys required as input for the algorithm:
- No Key - Hash Functions
- One Key - Symmetric Algorithms
- Two Keys - Asymmetric Algorithms
Hash algorithms transform an input of arbitrary size into a small fixed-size output. The output calculated by the hash algorithm is referred to as a digest or hash value. Operation of a hash algorithm does not require any cryptographic key, and the algorithm operates in a one-way manner: it is computationally infeasible to recover the input data from the output data. There are two categories of hash algorithms based on their design:
- Hash algorithms based on mathematical problems: In the first category are those functions whose designs are based on a mathematical problem, and thus their security follows from rigorous mathematical proofs, complexity theory, and formal reduction. These functions are called provably secure cryptographic hash functions. However, this does not mean that such a function could not be broken. Constructing them is very difficult, and only a few examples have been introduced, so their practical use is limited.
- Hash algorithms based on confusion/diffusion: In the second category are functions that are not based on mathematical problems but on an ad hoc design in which the bits of the message are mixed to produce the hash. They are believed to be hard to break, but no formal proof is given. Almost all widely used hash functions fall into this category. Some of these functions have already been broken and are no longer in use.
Symmetric algorithms, also known as secret key algorithms, employ one single cryptographic key for encryption and decryption. Only the sender and receiver know the symmetric key. The further categorization of symmetric algorithms includes:
- Block algorithms: A block algorithm breaks the input into fixed-size blocks and then processes the crypto operations. Popular block algorithms are the Advanced Encryption Standard (AES) and Triple DES (3DES).
- Stream algorithms: Stream algorithms perform bit-by-bit crypto operations. Commonly used stream algorithms include RC4, A5/1, A5/2, and Chameleon.
Current Security of Symmetric & Hash Algorithms
The security of symmetric and hash algorithms rests on the fact that the key space is extensive and a brute force attack (attempting all possible keys) is not feasible, because of limited computational power and time constraints.
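As a quick illustration of the fixed-size, one-way digest property described above (a generic sketch using Python's standard hashlib, not any specific vendor implementation):

```python
import hashlib

# A cryptographic hash maps input of any length to a fixed-length digest;
# recovering the input from the digest is computationally infeasible.
for message in (b"short message", b"a much longer message " * 1000):
    digest = hashlib.sha256(message).hexdigest()
    print(f"{len(message):>6} bytes -> {digest}")
```

Both inputs, despite their very different lengths, produce 256-bit (64 hex character) digests.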
The Advent of Quantum Computing
Quantum computers threaten the cryptographic mechanisms behind current secure communication standards and protocols because they can perform computations at a rate that cannot be achieved by conventional computing systems. Traditional computing systems are based on basic building blocks known as bits, which can have only two states, 0 and 1. Quantum computing platforms are based on quantum bits, also known as qubits. Qubits can hold the state 0, the state 1, or both simultaneously; this property is known as superposition. The effect of quantum computing can be seen in two directions:
- The solution of complex/hard problems: Thanks to the superposition property, quantum computing platforms can solve hard and complex mathematical problems that are the security foundations of various cryptographic algorithms.
- The exponential increase in calculations: Some cryptographic algorithms, such as symmetric and hash algorithms, are based on the fact that a brute-force attack is infeasible. Quantum computers can affect these algorithms by exhaustively searching all secret keys much faster.
Quantum Platforms & Grover's Algorithm
The main threat to the security of symmetric and hash algorithms is Grover's algorithm. This algorithm uses a quantum computing platform to find a particular entry in an unsorted database of N entries in roughly √N searches, whereas traditional computing platforms need about N/2 searches on average.
Threats to Symmetric Algorithms from Quantum Computing
The implementation of Grover's algorithm on quantum platforms poses a serious threat to symmetric key algorithms: it accelerates an exhaustive key search (brute force) attack so much that the effective cryptographic key length is reduced by 50%. For an n-bit symmetric cryptographic algorithm, 2^n possible keys exist. For the 128-bit AES algorithm, the key space is 2^128, which is unbreakable using current computing platforms. After the realization of a quantum platform and the implementation of Grover's algorithm, the AES 128-bit key would be reduced to an insecure 64-bit equivalent key length, much like the legacy DES algorithm. Luckily, AES supports two other key lengths, and applications would have to switch to the 192-bit and 256-bit versions of AES.
Threats to Hash Algorithms from Quantum Computing
Hash algorithms will also suffer from Grover's algorithm because they produce a fixed-size output from input of any size. The augmented speed of Grover's algorithm can be used to expedite collision attacks, that is, finding two inputs with the same output. Similarly, the implementation of quantum-based platforms will be a problem for hash algorithms. However, because SHA-2 (256-bit) and SHA-3 (384-bit) have considerably longer outputs, they appear to remain quantum-resistant.
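The halving of effective key length described above can be summarized with a small back-of-envelope script. The figures are idealized query counts only (my own illustration); real quantum hardware would add enormous constant overheads.

```python
# Idealized brute-force cost: ~2**(n-1) classical guesses on average versus
# ~2**(n/2) Grover iterations for an n-bit key or preimage search.
def effective_strength(n_bits):
    return n_bits - 1, n_bits / 2

for name, n in [("AES-128", 128), ("AES-256", 256), ("SHA-256 preimage", 256)]:
    classical, grover = effective_strength(n)
    print(f"{name}: ~2^{classical} classical vs ~2^{grover:.0f} with Grover")
```

This is why the usual advice is simply to double symmetric key lengths and hash output sizes rather than replace the algorithms entirely.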
https://content.hsm.utimaco.com/blog/state-of-symmetric-hash-algorithms-after-quantum-computing
Will Fusion Energy Power The Future?
Fusion Energy: The Next Power Revolution?
Fusion refers to a form of energy generation. It's sometimes known as nuclear fusion, and it's the opposite of nuclear fission (which powers today's nuclear reactors). Fusion energy holds a great deal of promise for humanity. While nuclear fission produces energy by splitting an atom, fusion reactors fuse two light atoms into a heavier one. This releases huge amounts of energy. Fusion energy is what powers the sun and other stars. Essentially, stars are huge fusion reactors.
Fusion Energy: How It Works
Despite what you learned in high school science classes, there are actually four fundamental states of matter. In addition to solids, liquids, and gases, there is also plasma. But since plasma rarely occurs naturally on Earth, we don't often consider it. Plasma is a special electrically charged, or ionized, gas. Plasma behaves differently than other states of matter, although the mechanics can be hard to understand. For our purposes, it's enough to know that plasma only occurs at low pressures and high temperatures. Usually, it's created through the use of electromagnetic fields. Fusion energy is created when special forms of hydrogen, known as deuterium and tritium, are heated to enormously hot temperatures. The only way to contain the resulting plasma, so far as we know, is to use magnetic fields. Unfortunately, scientists are struggling to create a reactor that works. Current versions can't hold the plasma for an extended period of time at sufficient temperature and size. Which leads us to…
Fusion Energy: The Challenges
Fusion energy is the type of technology that is still faced with certain engineering barriers, much like the space elevator. Currently, every type of fusion reactor design ends up having a negative energy balance. In short, this means that the reactor takes more energy to run than it produces. This is because fusion plasma doesn't like to be contained. We have to contain it at high pressures and temperatures over a long enough period of time to actually generate energy. Otherwise the massive amount of energy spent making the plasma doesn't get earned back. Stars, like the sun, are able to contain plasma with their massive gravity. That's why massive plasma flares on the surface of the sun end up being subsumed back into the sun. Germany recently brought online its Wendelstein 7-X stellarator. This is expected to run for up to 30 minutes at a time, which would blow the current record of 102 seconds away. But the device in Germany was never intended to be net-positive on energy. Rather, it was meant to show proof of concept of fusion energy. So faced with all these challenges, why do we still pursue fusion?
Fusion Energy: Applications
Because in many ways, fusion would be an almost perfect source of energy. The fuel itself, which is mostly deuterium, is found abundantly in Earth's oceans. One out of every 6,500 hydrogen atoms in the ocean is deuterium. While that may not seem like a lot, there are literally trillions upon trillions of these atoms in the ocean. Furthermore, fusion produces so much more energy than other sources that not much fuel would be needed to generate massive amounts of power. And even though fusion energy isn't technically renewable, it has many of the same benefits. It emits no air pollution or greenhouse gas and much less radiation than fission. Unlike other renewable sources, it's not dependent on weather or location either.
Because of this it won’t suffer from either diseconomies of scale or power interruption. And since solar energy isn’t available in interstellar space, fusion is a possible enabling technology for space-faring. Building the Fusion Energy Reactor The most promising solution to the problems facing fusion energy is the tokamak. Tokamak originally comes from a complicated Russian acronym, so sticking with tokamak is fine. The word is pretty awesome on its own. Essentially, the tokamak is a device that harnesses a powerful magnetic field to confine plasma in a torus. A torus is a fancy geometry term for a donut. You can see a torus in the picture above. Tokamak technology is what is being used in ITER, discussed below. However, recently a group of scientists say they think they’ve found a different solution. They designed a bizarre spherical reactor that could theoretically achieve net-positive nuclear fusion. This could be the key to commercially available fusion energy. Their reactor would use hydrogen and boron instead. It also leverages lasers to “heat up the core to 200 times hotter than the center of the Sun.” The team that released the study believes it could be built sooner than any current design. In fact ,they think that pending any unexpected engineering issues, such a reactor could be built within a decade. It also pointed out that their process would produce no radioactive waste at all. Is Fusion the Future? That remains to be decided. Like the space elevator, we have some significant technological hurdles. But given the possibilities, it’s likely that scientists and governments will keep trying. While the laser project discussed above is interesting, it’s still mostly conceptual. ITER, or the International Thermonuclear Experimental Reactor, is the largest current nuclear fission project. It’s an international engineering megaproject. When it’s completed, it will be the world’s first fully functioning fusion reactor. It’s being supported by more than thirty-five nations including the USA, China, India, Japan and Russia. Construction began in earnest in 2008 in France. Currently, the facility plans to complete construction by 2021. It will then aim to achieve plasma by 2025 and be operational by 2035. Interestingly, scientists and governments are already planning the next generation project. DEMO, or DEMOnstratio Power Station, will build upon the ITER experimental fusion reactor. It will be the link between ITER and true commercially available fusion power. The hope is that the DEMO system will be operational before 2050. Like other ambitious project – such as the Human Genome Project – the process will become cheaper and more efficient as time goes on. Because technology is designed, improved, and refined during the experimental stage, each subsequent experiment is easier. Clean, cheap, plentiful energy could take human civilization to the next stage. It might even be one of the final steps before we can become a spacefaring species. - Discover The Incredible Life Cycle Of A Star - April 17, 2018 - Will Fusion Energy Power The Future? - February 16, 2018 - Immersive Experience Technology: The Future of VR? - February 14, 2018 - Why Is Everyone Talking About the Fermi Paradox? - February 9, 2018 - Could A Space Elevator Be Coming Soon? - February 7, 2018 - What is Emergence? 
https://coolkidproblems.com/fusion-energy/
For the first time, scientists have demonstrated laser communications between a microsatellite and a ground station while using the quantum nature of photons to secure the data being transmitted. The work, carried out by researchers at the National Institute of Information and Communications Technology (NICT) in Japan and recently published in the journal Nature Photonics, demonstrates an "unhackable" quantum communications technology known as Quantum Key Distribution, or QKD. As the world edges closer to quantum computing, current methods of securing transmitted data may be rendered obsolete, so a new method to secure data will be required, the researchers argue. "The main advantage [of QKD] is the unconditional security," team leader Alberto Carrasco-Casado told Space.com. "When quantum computers are developed, the security of conventional communications will be compromised, since current cryptography is based only on computational complexity. The development of practical quantum computers is only a matter of time, which has made quantum communication a hot topic in the last few years, and the tendency is foreseen to increase in the future," he added. QKD is a very attractive method to totally secure communications. By recording data in the quantum states of individual photons, the process ensures that should the signal be intercepted, the quantum states will change — causing the recipient of the signal to be alerted of the breach. This is a basic tenet of quantum mechanics, based on Heisenberg's uncertainty principle; one cannot simply observe a quantum particle (in this case a photon) without irrevocably changing that particle's quantum state. With QKD, a secret key is shared between the transmitter and receiver. If a hacker tries to decode the signal as it travels from one to the other, the signal itself changes on a quantum level. So the system detects the hacking event, the secret key is discarded and the signal is broken, preventing the hack from continuing. To demonstrate this secured, high-capacity transmission of data between an Earth-based station and a satellite in low-Earth orbit (LEO), Carrasco-Casado's team used the quantum-communication transmitter, called SOTA (Small Optical TrAnsponder), on board the microsatellite SOCRATES (Space Optical Communications Research Advanced Technology Satellite) that was launched by the Japan Aerospace Exploration Agency (JAXA) in 2014. Weighing only 13 lbs. (6 kilograms), SOTA is the smallest quantum communications transmitter ever tested. Orbiting above Earth at 372 miles (600 kilometers), SOCRATES was traveling at over 15,000 mph (7 kilometers per second) as SOTA established contact with a 1-meter telescope located in Tokyo's Koganei city. The received signal was then guided to a quantum receiver to decode the information using a QKD protocol, the researchers wrote in their study. SOTA encoded each photon with 1 bit of data, either a "1" or a "0," by switching the photons between two polarized states, a method known as a "single-photon regime." SOTA then beamed laser pulses at a rate of 10 million bits per second. On reaching the ground station, the laser signal was extremely weak (the researchers say that, on average, only 0.1 laser photons were received per pulse), but the quantum receiver was still able to detect the signal and decode the information over a low level of noise.
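The article does not spell out the exact protocol used on SOTA, but the classic BB84 scheme conveys the general idea of turning single-photon polarization states into a shared secret key. The sketch below is a purely illustrative simulation with no eavesdropper and made-up parameters, not NICT's implementation.

```python
import secrets

# Toy BB84-style key sifting: sender and receiver each pick random bases;
# only bits measured in matching bases are kept for the raw key.
n = 32
alice_bits = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
bob_bases = [secrets.randbelow(2) for _ in range(n)]

# With no eavesdropper, a matching basis reproduces the sent bit exactly;
# a mismatched basis yields a random result and that position is discarded.
bob_results = [bit if a_base == b_base else secrets.randbelow(2)
               for bit, a_base, b_base in zip(alice_bits, alice_bases, bob_bases)]

sifted_key = [bit for bit, a, b in zip(bob_results, alice_bases, bob_bases) if a == b]
print("sifted key bits:", "".join(map(str, sifted_key)))
```

In a real QKD session the two parties would also publicly compare a random sample of the sifted key: an eavesdropper measuring the photons in flight would disturb their states and show up as an elevated error rate, which is the tamper-evidence property the article describes.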
Now that the technology has been demonstrated using a microsatellite, Carrasco-Casado said he is thinking about future applications. "Maybe the most exciting application would be applying QKD to satellite constellations," he said. "Several constellations are now being considered with a huge number of satellites … SpaceX constellation plans to use over 4,000 satellites. If QKD can be miniaturized following the heritage of SOTA, this technology could be spread massively, enabling a truly-secure global communication network." The SOCRATES/SOTA mission ended in September 2016 after the satellite failed, Carrasco-Casado added, but the experiment had more than doubled the originally designed mission duration of over a year. "We are working on other future missions that will leverage the expertise and knowledge acquired with the SOCRATES/SOTA mission," he said. There is international interest in quantum communications, with research being carried out in Japan, China, Europe, Canada and the U.S. A Chinese research team recently announced the successful quantum teleportation of individual photons from a ground station to the orbiting satellite Micius, which was launched by China last year. Like QKD, teleportation is a form of quantum communications. Teleportation involves the production of two quantum particles that form at the same time, in the same place — the two particles share the same quantum states, and measurements on one impact the other instantaneously. They are "entangled." In the Chinese experiment, pairs of entangled photons were produced on the ground, and some of the photons were transmitted to the ultra-sensitive receiver on Micius orbiting overhead. When the photons were received by the satellite, the researchers were able to confirm entanglement with those on the ground by "teleporting" the quantum state of a photon between the two, over hundreds of miles. "This work establishes the first ground-to-satellite up-link for faithful and ultra-long-distance quantum teleportation, an essential step toward global-scale quantum internet," the researchers wrote in their study, which was posted on the ArXiv preprint repository. As low-Earth orbit becomes more crowded, competition for the shrinking availability of radio frequency (RF) bands will eventually create a communications bottleneck, Carrasco-Casado's team said in a statement, so quantum communications solutions using laser technology will be needed not only to transmit data secured against hacking attempts, but also to send much larger quantities of data in a smaller space, as laser transmission allows.
https://www.space.com/37622-quantum-communications-microsatellite-to-earth.html
In the world of computers, silicon is king. The semiconducting element forms regular, near-perfect crystals into which chipmakers can carve the hundreds of millions of features that make up the microchips powering today's processors. Technological improvements let chipmakers cut the size of those features in half every 18 months, a feat known as Moore's law, after Intel cofounder Gordon Moore. Today, that size hovers around 180 nanometers (180 billionths of a meter), and researchers expect to push below 50 nanometers within a decade. But that's about as far as silicon can go: below that, quantum physics makes electrons too unruly to stay inside the lines. If computers are to keep up with Moore's law, they will have to move beyond silicon. After a couple of decades of theorizing, computer scientists, bioengineers and chemists in the mid-1990s began lab experiments seeking alternative materials for future CPUs and memory chips. Today, their research falls into three broad categories: quantum, molecular and biological computing. In the field of quantum computing, researchers seek to harness the quantum effects that will be silicon's undoing. Scientists have succeeded in making rudimentary logic gates out of molecules, atoms and sub-atomic particles such as electrons. And incredibly, other teams have discovered ways to perform simple calculations using DNA strands or microorganisms that group and modify themselves.
Molecular Building Blocks
In one type of molecular computing (or nanocomputing), joint teams at Hewlett Packard Co. and UCLA sandwich complex organic molecules between metal electrodes coursing through a silicon substrate. The molecules orient themselves on the wires and act as switches. Another team at Rice and Yale universities has identified other molecules with similar properties. Normally, the molecules won't let electrons pass through to the electrodes, so a quantum property called tunneling, long used in electronics, is manipulated with an electric current to force the electrons through at the proper rate. If researchers can figure out how to lay down billions of these communicating molecules, they'll be able to build programmable memory and CPU logic that is potentially millions of times more powerful than in today's computers. Molecular researchers like the HP/UCLA team, however, face a challenge in miniaturizing their current wiring technology (nanowires made from silicon strands) from several hundred nanometers to approximately 10 nanometers. Carbon nanotubes are promising substitutes. The rigid pipes make excellent conductors, but scientists must figure out how to wrangle them into the latticework needed for complex circuitry. "We've shown that the switching works," says HP computer architect Philip Kuekes. "But there is still not as good an understanding of the basic mechanism so that an engineer can design with it." Hewlett Packard and UCLA have jointly patented several techniques for manufacturing molecular computers, most recently in January of 2002.
Although molecular circuits employ some quantum effects, a separate but related community of scientists is exploring the possibilities of quantum computing: computing with atoms and their component parts. It works from the notion that some aspect of a sub-atomic particle (say, the location of an electron's orbit around a nucleus) can be used to represent the 1s and 0s of computers. As with molecules, these states can be manipulated, or programmed, in effect.
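One way to see why such quantum states are interesting, and why they are so hard to mimic on classical hardware (a point the article elaborates below), is to count what it takes to describe them: an n-qubit register requires 2^n complex amplitudes. A tiny, illustrative calculation:

```python
# Describing an n-qubit state classically takes 2**n complex amplitudes.
# At 16 bytes per double-precision complex number, memory needs explode quickly.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    gigabytes = amplitudes * 16 / 1e9
    print(f"{n} qubits -> {amplitudes:.2e} amplitudes (~{gigabytes:.3g} GB)")
```

Fifty qubits already imply tens of petabytes of amplitudes, which is the flip side of the exponential storage the next paragraphs describe.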
One approach, pursued by members of a national consortium involving Berkeley, Harvard, IBM, MIT and others, involves flipping the direction of a spinning electron to turn switches on or off. By applying electromagnetic radiation in a process called nuclear magnetic resonance (NMR), like that used in medical imaging, researchers can control the spin of the carbon and hydrogen nuclei in chloroform. Alternatively, filters and mirrors show promise for controlling photons' light as a switching mechanism. Other researchers work with materials such as quantum "dots" (electrons in silicon crystal) and "ion traps" (ionized atoms suspended in an electrical field). Quantum bits (qubits) have an unusual quality that makes them a double-edged sword for computing purposes, though. Due to the lack of determinism inherent in quantum mechanics, qubits can be on and off simultaneously, a phenomenon called superposition. This makes it harder to force qubits into digital lockstep, but it also multiplies exponentially the amount of information groups of qubits can store. It theoretically allows massively parallel computation to solve problems previously thought uncomputable, such as factoring very large numbers into their prime factors. One implication: today's encryption techniques depend on the infeasibility of computing the two multipliers (factors) of certain numbers, so quantum computers may one day be able to crack most encrypted files that exist today. This possibility has given the research a boost from government agencies, including the National Security Agency. To be manufacturable, quantum computers will require billions of such sub-atomic switches working together and interacting with their environments without falling into a disorganized state called decoherence. A quantum state called entanglement, in which many atoms are made to behave exactly alike, provides one possible solution. Researchers also hope to fight decoherence by harnessing a phenomenon called interference, that is, the overlapping of quantum particles' wavelike energy.
Getting Down to the Biology
In addition to molecular and quantum computing, a third approach, biological computing, relies on living mechanisms to perform logic operations. Bioengineers have long understood how to manipulate genes to function as switches that activate other genes. Now they're using the technique to build rudimentary computer "clocks" and logic gates inside bacteria such as E. coli. Other researchers use genes to prod microorganisms into states that represent information. A team headed by Thomas Knight at the MIT Artificial Intelligence Laboratory genetically manipulates luciferase, an enzyme in luminescent creatures such as fireflies, to generate light that serves as a medium of cell-to-cell communication. One of biological computing's biggest challenges is calculating with elements that are flawed, unreliable and decentralized. To that end, Knight's amorphous computing group studies ways to encourage bacteria to organize themselves into parallel-processing computers. "I don't think of it as likely to be the path to making conventional computers," Knight says. "It will be the way in which we build the molecular-scale computers." Molecular computers face similar reliability challenges. At HP, researchers used fault-tolerant algorithms to construct a silicon-based computer called Teramac that worked despite having 220,000 defects. Kuekes, Teramac's project manager, says the company is now exploring ways to translate what they've learned to molecular computing.
Farther out on the biological curve is DNA computing, which attempts to exploit the way DNA strands recognize each other and combine into structures that could perform large, compute-intensive calculations in parallel. Few in the biological community expect biocomputers to replace the general-purpose silicon computer. They hope instead to manufacture molecular computers cheaply and efficiently with organisms that can orient themselves into logic circuits or transform vats of chemicals to manufacture other chemicals. Still more exciting possibilities come from the potential of special-purpose biological computers to interact with other biological systems. Miniature computers could be injected into living tissue to reprogram cancer-causing genes, for example, or administer insulin shots. For now, all these applications loom distant on the horizon. But researchers agree that silicon’s days are numbered, and that radical new approaches will be needed to keep computers zooming through the 21st century.
https://www.technologyreview.com/2002/01/28/41130/the-future-of-cpus-in-brief/
Prototype device enables photon-photon interactions at room temperature for quantum computing Ordinarily, light particles—photons—don't interact. If two photons collide in a vacuum, they simply pass through each other. An efficient way to make photons interact could open new prospects for both classical optics and quantum computing, an experimental technology that promises large speedups on some types of calculations. In recent years, physicists have enabled photon-photon interactions using atoms of rare elements cooled to very low temperatures. But in the latest issue of Physical Review Letters, MIT researchers describe a new technique for enabling photon-photon interactions at room temperature, using a silicon crystal with distinctive patterns etched into it. In physics jargon, the crystal introduces "nonlinearities" into the transmission of an optical signal. "All of these approaches that had atoms or atom-like particles require low temperatures and work over a narrow frequency band," says Dirk Englund, an associate professor of electrical engineering and computer science at MIT and senior author on the new paper. "It's been a holy grail to come up with methods to realize single-photon-level nonlinearities at room temperature under ambient conditions." Joining Englund on the paper are Hyeongrak Choi, a graduate student in electrical engineering and computer science, and Mikkel Heuck, who was a postdoc in Englund's lab when the work was done and is now at the Technical University of Denmark. Quantum computers harness a strange physical property called "superposition," in which a quantum particle can be said to inhabit two contradictory states at the same time. The spin, or magnetic orientation, of an electron, for instance, could be both up and down at the same time; the polarization of a photon could be both vertical and horizontal. If a string of quantum bits—or qubits, the quantum analog of the bits in a classical computer—is in superposition, it can, in some sense, canvass multiple solutions to the same problem simultaneously, which is why quantum computers promise speedups. Most experimental qubits use ions trapped in oscillating magnetic fields, superconducting circuits, or—like Englund's own research—defects in the crystal structure of diamonds. With all these technologies, however, superpositions are difficult to maintain. Because photons aren't very susceptible to interactions with the environment, they're great at maintaining superposition; but for the same reason, they're difficult to control. And quantum computing depends on the ability to send control signals to the qubits. That's where the MIT researchers' new work comes in. If a single photon enters their device, it will pass through unimpeded. But if two photons—in the right quantum states—try to enter the device, they'll be reflected back. The quantum state of one of the photons can thus be thought of as controlling the quantum state of the other. And quantum information theory has established that simple quantum "gates" of this type are all that is necessary to build a universal quantum computer. The researchers' device consists of a long, narrow, rectangular silicon crystal with regularly spaced holes etched into it. The holes are widest at the ends of the rectangle, and they narrow toward its center. Connecting the two middle holes is an even narrower channel, and at its center, on opposite sides, are two sharp concentric tips. 
The pattern of holes temporarily traps light in the device, and the concentric tips concentrate the electric field of the trapped light. The researchers prototyped the device and showed that it both confined light and concentrated the light's electric field to the degree predicted by their theoretical models. But turning the device into a quantum gate would require another component, a dielectric sandwiched between the tips. (A dielectric is a material that is ordinarily electrically insulating but will become polarized—all its positive and negative charges will align in the same direction—when exposed to an electric field.) When a light wave passes close to a dielectric, its electric field will slightly displace the electrons of the dielectric's atoms. When the electrons spring back, they wobble, like a child's swing when it's pushed too hard. This is the nonlinearity that the researchers' system exploits. The size and spacing of the holes in the device are tailored to a specific light frequency—the device's "resonance frequency." But the nonlinear wobbling of the dielectric's electrons should shift that frequency. Ordinarily, that shift is mild enough to be negligible. But because the sharp tips in the researchers' device concentrate the electric fields of entering photons, they also exaggerate the shift. A single photon could still get through the device. But if two photons attempted to enter it, the shift would be so dramatic that they'd be repulsed. The device can be configured so that the dramatic shift in resonance frequency occurs only if the photons attempting to enter it have particular quantum properties—specific combinations of polarization or phase, for instance. The quantum state of one photon could thus determine the way in which the other photon is handled, the basic requirement for a quantum gate. Englund emphasizes that the new research will not yield a working quantum computer in the immediate future. Too often, light entering the prototype is still either scattered or absorbed, and the quantum states of the photons can become slightly distorted. But other applications may be more feasible in the near term. For instance, a version of the device could provide a reliable source of single photons, which would greatly abet a range of research in quantum information science and communications. "This work is quite remarkable and unique because it shows strong light-matter interaction, localization of light, and relatively long-time storage of photons at such a tiny scale in a semiconductor," says Mohammad Soltani, a nanophotonics researcher in Raytheon BBN Technologies' Quantum Information Processing Group. "It can enable things that were questionable before, like nonlinear single-photon gates for quantum information. It works at room temperature, it's solid-state, and it's compatible with semiconductor manufacturing. This work is among the most promising to date for practical devices, such as quantum information devices." This story is republished courtesy of MIT News (web.mit.edu/newsoffice/), a popular site that covers news about MIT research, innovation and teaching.
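The blockade mechanism described above, a cavity whose resonance shifts when more than one photon is present, can be illustrated with a toy calculation. The sketch below is not a model of the MIT device; it simply assumes a Lorentzian cavity line and a fixed per-photon resonance shift (both made-up numbers) to show why a single photon is transmitted while a pair is rejected.

```python
import numpy as np

# Illustrative parameters (not from the paper): cavity linewidth and the
# nonlinear resonance shift caused by each photon circulating in the cavity.
kappa = 1.0                      # cavity linewidth (arbitrary units)
shift_per_photon = 5.0 * kappa   # assumed strong per-photon frequency shift

def transmission(detuning, kappa):
    """On-resonance-normalized Lorentzian transmission of a symmetric cavity."""
    return 1.0 / (1.0 + (2.0 * detuning / kappa) ** 2)

# One photon arriving on resonance sees an unshifted cavity.
t_one = transmission(0.0, kappa)

# Two photons arriving together: each sees a cavity pulled off resonance
# by the presence of the other photon.
t_two = transmission(shift_per_photon, kappa)

print(f"single-photon transmission ~ {t_one:.2f}")   # ~1.00: passes through
print(f"two-photon transmission    ~ {t_two:.3f}")   # ~0.01: mostly reflected
```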
Complex oxides are compounds that contain oxygen and at least two other elements. These materials exhibit the unusual electric and magnetic properties needed for next-generation electronic devices. Because silicon is the dominant electronic material, any promising complex oxide should be capable of interfacing with it. However, achieving this interface is challenging. For instance, lead zirconium titanate (PZT) is a well-known complex oxide that is strongly ferroelectric, but it fails to properly “grow” on silicon. One solution is to form thin-film PZT on a compatible substrate and then transfer it to silicon. While conceptually straightforward, the effects of such transfers on thin films are largely unknown. In order to resolve this mystery a research team investigated the properties of a transferred thin film using several techniques, including scanning probe microscopy and charge-voltage relationship measurement performed at Argonne National Laboratory's Material Science Division (MSD), and x-ray nanodiffraction experimentation carried out at the U.S. Department of Energy’s Advanced Photon Source (APS) and Center for Nanoscale Materials (CNM), also at Argonne. The researchers found that the static ferroelectric surface charge and structural properties of the transferred PZT film were more-or-less preserved. However, the ferroelectric’s dynamic electromechanical response changed substantially. Taken together, these findings, published in the journal Advanced Materials, demonstrate the feasibility of transferring thin-film PZT and other complex oxides to silicon, thereby promoting their use for applications including non-volatile computer memory and quantum computing. PZT was developed in the 1950s as a piezoelectric compound, meaning it produces electricity when deformed, and changes shape in response to an electric field. Due to its excellent piezoelectric properties, PZT is widely used in ultrasound transducers and actuators. This widely-used material is also ferroelectric, meaning that positive and negative charge separation spontaneously arises. Interest in ferroelectrics has increased due to their potential use in computational, switching, and sensor applications. But the advantages offered by ferroelectrics depend upon integration with silicon. Due to their mismatched crystalline structures, PZT will not correctly form on silicon. Fortunately, the lead author of this work previously (while doing postdoctoral research at UC Berkeley) developed an alternative approach known as layer transfer technique (LTT). Using LTT allows scientists to form a thin-film complex oxide on a highly-compatible substrate and then transfer the film to another substrate. For this study, the researchers from Argonne, the Korea Advanced Institute of Science and Technology (KAIST), and the University of California, Berkeley used pulsed laser deposition to form a crystalline layer of PZT on one substrate and then moved it to another substrate. The researchers were interested in resolving several issues, including whether LTT could transfer a PZT film without destroying it; to provide the first-ever look at the underside of a complex oxide thin-film; and to determine how the transfer affected the film's properties. Figure 1 illustrates the experimental concept. Thin-film PZT (Fig. 1a) is extracted from the substrate (Fig. 1b) and placed upside-down on a similar substrate (Fig. 1c). 
This LTT procedure, performed in the CNM cleanroom facility, releases the molecular bonds between the PZT and its original substrate, resulting in a freestanding film resting on the second substrate. Scanning probe microscopy experiments and charge-voltage relationship measurement, performed at MSD, revealed drastically reduced dynamic ferroelectric properties in freestanding film. X-ray nanoprobe data, gathered at the joint CNM/X-ray Science Division 26-ID beamline at the APS, confirmed that the transferred film's crystalline structure remained intact. Polarized regions are created using scanning probe microscope in ferroelectric films. The nanoscale spaces between two oppositely-polarized areas are called domain walls. Upon applying an electric field, the intervening domain walls can rapidly shift. Taking snapshots of the position of a particular domain wall after applying pulsed electric fields revealed that domain wall movement was 100 to 1000 times slower in the freestanding film versus the originally-deposited form (Fig. 2a). The reduction of domain wall speed was unexpected since theory indicated this speed should actually increase in a strain-free, freestanding film. The researchers attributed the dramatic reduction in wall speed to the induced flexoelectric fields within the film that altered its polarization landscape. The presence of such flexoelectric fields was confirmed by capacitance measurements and numerical simulations. The induced flexoelectric field arose from the pronounced crystallographic tilts caused by thin-film separation, as revealed by the contact mode scanning probe microscopy and x-ray data. Although wall speed was lowered in the freestanding film, its polarization strength was little changed. The fact that the crystallographic structure and important ferroelectric properties (polarization strength, etc.) were largely preserved in the freestanding PZT film indicates that integrating thin films of complex oxides with silicon is entirely feasible using LTT. However, the researchers note that effects arising from the flexoelectric fields will require additional investigation. ― Philip Koth See: Saidur R. Bakaul1*, Jaegyu Kim2, Seungbum Hong2, Mathew J. Cherukara1, Tao Zhou1, Liliana Stan1, Claudy R. Serrao3, Sayeef Salahuddin3, Amanda K. Petford-Long1, Dillon D. Fong1, and Martin V. Holt1, “Ferroelectric Domain Wall Motion in Freestanding Single-Crystal Complex Oxide Thin Film,” Adv. Mater. 32, 1907036 (2020). DOI: 10.1002/adma.201907036 Author affiliations: 1Argonne National Laboratory, 2Korea Advanced Institute of Science and Technology (KAIST), 3University of California, Berkeley Scanning probe microscopy, electronic transport, and sample fabrication carried out at Argonne National Laboratory were supported by the U.S. Department of Energy (DOE) Office of Science-Basic Energy Sciences, Materials Sciences and Engineering Division. Use of the Center for Nanoscale Materials was supported by the U.S. DOE Office of Science-Basic Energy Sciences, under contract No. DE-AC02- 06CH11357. Materials growth carried out at the University of California Berkeley was supported by Office of Naval Research Contract No: N00014-14-1-0654. J.K. and S.H. acknowledge support from Brain Korea 21 Plus and KAIST. This research used resources of the Advanced Photon Source, a U.S. DOE Office of Science User Facility operated for the DOE Office of Science by Argonne National Laboratory under Contract No. DE-AC02-06CH11357. The U.S. 
Department of Energy's (DOE) APS is one of the world’s most productive x-ray light source facilities. Each year, the APS provides high-brightness x-ray beams to a diverse community of more than 5,000 researchers in materials science, chemistry, condensed matter physics, the life and environmental sciences, and applied research. Researchers using the APS produce over 2,000 publications each year detailing impactful discoveries, and solve more vital biological protein structures than users of any other x-ray light source research facility. APS x-rays are ideally suited for explorations of materials and biological structures; elemental distribution; chemical, magnetic, electronic states; and a wide range of technologically important engineering systems from batteries to fuel injector sprays, all of which are the foundations of our nation’s economic, technological, and physical well-being. The Center for Nanoscale Materials is one of the five DOE Nanoscale Science Research Centers, premier national user facilities for interdisciplinary research at the nanoscale supported by the DOE Office of Science. Together the NSRCs comprise a suite of complementary facilities that provide researchers with state-of-the-art capabilities to fabricate, process, characterize and model nanoscale materials, and constitute the largest infrastructure investment of the National Nanotechnology Initiative. The NSRCs are located at DOE’s Argonne, Brookhaven, Lawrence Berkeley, Oak Ridge, Sandia and Los Alamos National Laboratories. For more information about the DOE NSRCs, please visit https://science.osti.gov/User-Facilities/User-Facilities-at-a-Glance. Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC, for the U.S. DOE Office of Science. The U.S. Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the Office of Science website.
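As a rough illustration of the domain-wall speed measurement described above (snapshots of a wall's position after successive voltage pulses), the sketch below fits a wall velocity to made-up position data. The numbers are placeholders for illustration only, not values from the Advanced Materials paper.

```python
import numpy as np

# Hypothetical data: cumulative pulse duration (s) and measured wall position (m).
# In the experiment these would come from scanning probe microscope snapshots.
pulse_time = np.array([0.0, 1e-6, 2e-6, 3e-6, 4e-6])        # seconds
position   = np.array([0.0, 45e-9, 92e-9, 138e-9, 181e-9])  # metres

# A straight-line fit gives the average domain wall velocity.
velocity, intercept = np.polyfit(pulse_time, position, 1)
print(f"estimated wall velocity: {velocity:.2e} m/s")

# The paper reports that wall velocity dropped by a factor of roughly 100-1000
# in the freestanding film relative to the as-grown film.
slowdown = 300  # illustrative factor within the reported range
print(f"freestanding-film velocity would then be ~ {velocity / slowdown:.2e} m/s")
```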
Try a quick experiment: Take two flashlights into a dark room and shine them so that their light beams cross. Notice anything peculiar? The rather anticlimactic answer is, probably not. That’s because the individual photons that make up light do not interact. Instead, they simply pass each other by, like indifferent spirits in the night. But what if light particles could be made to interact, attracting and repelling each other like atoms in ordinary matter? One tantalizing, albeit sci-fi possibility: light sabers – beams of light that can pull and push on each other, making for dazzling, epic confrontations. Or, in a more likely scenario, two beams of light could meet and merge into one single, luminous stream. It may seem like such optical behavior would require bending the rules of physics, but in fact, scientists at MIT, Harvard University, and elsewhere have now demonstrated that photons can indeed be made to interact – an accomplishment that could open a path toward using photons in quantum computing, if not in light sabers. In a paper published today in the journal Science, the team, led by Vladan Vuletic, the Lester Wolfe Professor of Physics at MIT, and Professor Mikhail Lukin from Harvard University, reports that it has observed groups of three photons interacting and, in effect, sticking together to form a completely new kind of photonic matter. In controlled experiments, the researchers found that when they shone a very weak laser beam through a dense cloud of ultracold rubidium atoms, rather than exiting the cloud as single, randomly spaced photons, the photons bound together in pairs or triplets, suggesting some kind of interaction – in this case, attraction – taking place among them. While photons normally have no mass and travel at 300,000 kilometers per second (the speed of light), the researchers found that the bound photons actually acquired a fraction of an electron’s mass. These newly weighed-down light particles were also relatively sluggish, traveling about 100,000 times slower than normal noninteracting photons. Vuletic says the results demonstrate that photons can indeed attract, or entangle each other. If they can be made to interact in other ways, photons may be harnessed to perform extremely fast, incredibly complex quantum computations. “The interaction of individual photons has been a very long dream for decades,” Vuletic says. Vuletic’s co-authors include Qi-Yung Liang, Sergio Cantu, and Travis Nicholson from MIT, Lukin and Aditya Venkatramani of Harvard, Michael Gullans and Alexey Gorshkov of the University of Maryland, Jeff Thompson from Princeton University, and Cheng Ching of the University of Chicago. Biggering and biggering Vuletic and Lukin lead the MIT-Harvard Center for Ultracold Atoms, and together they have been looking for ways, both theoretical and experimental, to encourage interactions between photons. In 2013, the effort paid off, as the team observed pairs of photons interacting and binding together for the first time, creating an entirely new state of matter. In their new work, the researchers wondered whether interactions could take place between not only two photons, but more. “For example, you can combine oxygen molecules to form O2 and O3 (ozone), but not O4, and for some molecules you can’t form even a three-particle molecule,” Vuletic says. 
“So it was an open question: Can you add more photons to a molecule to make bigger and bigger things?” To find out, the team used the same experimental approach they used to observe two-photon interactions. The process begins with cooling a cloud of rubidium atoms to ultracold temperatures, just a millionth of a degree above absolute zero. Cooling the atoms slows them to a near standstill. Through this cloud of immobilized atoms, the researchers then shine a very weak laser beam – so weak, in fact, that only a handful of photons travel through the cloud at any one time. The researchers then measure the photons as they come out the other side of the atom cloud. In the new experiment, they found that the photons streamed out as pairs and triplets, rather than exiting the cloud at random intervals, as single photons having nothing to do with each other. In addition to tracking the number and rate of photons, the team measured the phase of photons, before and after traveling through the atom cloud. A photon’s phase indicates its frequency of oscillation. “The phase tells you how strongly they’re interacting, and the larger the phase, the stronger they are bound together,” Venkatramani explains. The team observed that as three-photon particles exited the atom cloud simultaneously, their phase was shifted compared to what it was when the photons didn’t interact at all, and was three times larger than the phase shift of two-photon molecules. “This means these photons are not just each of them independently interacting, but they’re all together interacting strongly.” The researchers then developed a hypothesis to explain what might have caused the photons to interact in the first place. Their model, based on physical principles, puts forth the following scenario: As a single photon moves through the cloud of rubidium atoms, it briefly lands on a nearby atom before skipping to another atom, like a bee flitting between flowers, until it reaches the other end. If another photon is simultaneously traveling through the cloud, it can also spend some time on a rubidium atom, forming a polariton – a hybrid that is part photon, part atom. Then two polaritons can interact with each other via their atomic component. At the edge of the cloud, the atoms remain where they are, while the photons exit, still bound together. The researchers found that this same phenomenon can occur with three photons, forming an even stronger bond than the interactions between two photons. “What was interesting was that these triplets formed at all,” Vuletic says. “It was also not known whether they would be equally, less, or more strongly bound compared with photon pairs.” The entire interaction within the atom cloud occurs over a millionth of a second. And it is this interaction that triggers photons to remain bound together, even after they’ve left the cloud. “What’s neat about this is, when photons go through the medium, anything that happens in the medium, they ‘remember’ when they get out,” Cantu says. This means that photons that have interacted with each other, in this case through an attraction between them, can be thought of as strongly correlated, or entangled – a key property for any quantum computing bit. “Photons can travel very fast over long distances, and people have been using light to transmit information, such as in optical fibers,” Vuletic says. 
“If photons can influence one another, then if you can entangle these photons, and we’ve done that, you can use them to distribute quantum information in an interesting and useful way.” Going forward, the team will look for ways to coerce other interactions such as repulsion, where photons may scatter off each other like billiard balls. “It’s completely novel in the sense that we don’t even know sometimes qualitatively what to expect,” Vuletic says. “With repulsion of photons, can they be such that they form a regular pattern, like a crystal of light? Or will something else happen? It’s very uncharted territory.” Massachusetts Institute of Technology Observation of three-photon bound states in a quantum nonlinear medium, Science (2018). science.sciencemag.org/cgi/doi … 1126/science.aao7293 Credit: Science (2018). 10.1126/science.aao7293
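One way to see how "photons streaming out as pairs" differs from random arrivals is through photon-number statistics. The toy simulation below is entirely illustrative and is not the analysis used in the Science experiment; it compares the zero-delay second-order correlation g2(0) = <n(n-1)>/<n>^2 for a random (Poissonian) beam and for a hypothetical source that only ever emits photons two at a time. Bunched pair emission pushes g2(0) above 1, which is the kind of signature that distinguishes correlated output from independent photons.

```python
import numpy as np

rng = np.random.default_rng(0)
shots = 200_000

def g2(counts):
    """Zero-delay second-order correlation from per-window photon counts."""
    n = counts.astype(float)
    return np.mean(n * (n - 1)) / np.mean(n) ** 2

# Random beam: photon number per detection window is Poisson distributed.
poisson_counts = rng.poisson(lam=0.2, size=shots)

# Toy "paired" source: with small probability a window contains exactly 2 photons.
pair_prob = 0.1
paired_counts = 2 * rng.binomial(1, pair_prob, size=shots)

print(f"g2(0) random beam : {g2(poisson_counts):.2f}")   # ~1.0
print(f"g2(0) paired beam : {g2(paired_counts):.2f}")    # ~1/(2*0.1) = 5.0
```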
Microwave photonics circuit elements will need to be similar to their RF analogs to provide the desired functionality. One of these analogous circuit elements is a terahertz microwave cavity resonator, which can be integrated onto an IC with standard CMOS processes. This is one of many circuit elements that can be placed on an IC and used to enable unique applications.

[Image caption: These fibers will soon be integrated into semiconductor wafers as microwave lines to communicate with unique circuit elements like terahertz microcavity resonators.]

Microwave components have a lot more going on than what ends up in your microwave oven. Terahertz wave sources, detectors, and components have yet to be miniaturized, and the terahertz portion of the microwave spectrum is still largely unexplored. So far, the best we can do is get into the high GHz (low THz) region for oscillation, detection, and wave manipulation. This region is critical for many applications, including quantum computing, imaging, sensing, and ultra-fast communication.

One fundamental set of components is terahertz microcavity resonators. These components are part of a larger photonics platform, and they play roles analogous to RF resonators on a PCB. The simple geometry of these resonators also allows them to be placed on a chip alongside other photonic structures. If you're a budding photonics engineer, keep reading to learn more about these resonator structures and how they might play a role in current and upcoming photonics systems.

What Are Terahertz Microcavity Resonators?

Much like any other resonator, terahertz microcavity resonators have a fundamental frequency that lies in the terahertz region. In terms of wavelength, a 1 THz wave in air has a wavelength of only 300 microns, which is quite large compared to today's transistors. These structures provide the same function as well; they allow a wave matching the fundamental frequency or one of its harmonics to excite a high-Q resonance, whereby a standing wave can form in the cavity. Much like a wave on a string or in a waveguide, this standing wave at one of the eigenfrequencies will have very high intensity due to constructive interference inside the cavity. The very strong, very coherent electromagnetic wave in this structure can then be used for some other application. The challenges in working with these structures are wave generation and detection, both of which need to be solved for terahertz microcavity resonators to be useful at the chip level.

Geometry and Eigenfrequencies

A simple rectangular terahertz microcavity resonator has a discrete eigenfrequency spectrum, and the eigenfrequencies can be tuned to desired values by adjusting the geometry, just like any other resonator. The standard closed rectangular cavity expression, used in the sketch that follows, provides a good first approximation for a slightly lossy cavity (i.e., one with high dielectric constant contrast at the edge). Although a rectangular geometry is the simplest case, more complex structures may be used for different applications. In a different structure (e.g., circular, hemispherical, or cylindrical) with an open edge, the eigenfrequencies may not obey such a simple equation. Instead, they may be determined from a dispersion relation that is a transcendental equation, which requires a numerical technique to extract specific frequencies. This is a well-known procedure for solving Sturm-Liouville problems in waveguides and resonators.
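To make the closed-cavity approximation concrete, the sketch below evaluates the textbook eigenfrequency formula for a dielectric-filled rectangular cavity, f_mnl = c/(2*sqrt(eps_r)) * sqrt((m/a)^2 + (n/b)^2 + (l/d)^2). The dimensions and the choice of silicon (eps_r of about 11.7) are illustrative assumptions, chosen only to show that a cavity a couple of hundred microns on a side resonates in the sub-terahertz to terahertz range.

```python
import itertools
import numpy as np

C0 = 299_792_458.0                  # speed of light in vacuum, m/s
EPS_R = 11.7                        # assumed relative permittivity (silicon)
a, b, d = 150e-6, 150e-6, 150e-6    # assumed cavity dimensions, metres

def eigenfrequency(m, n, l):
    """Closed rectangular cavity resonance: c/(2*sqrt(eps_r)) * sqrt((m/a)^2+(n/b)^2+(l/d)^2)."""
    return (C0 / (2.0 * np.sqrt(EPS_R))) * np.sqrt((m / a) ** 2 + (n / b) ** 2 + (l / d) ** 2)

# List the lowest few modes; at least two indices must be nonzero for a physical mode.
modes = [idx for idx in itertools.product(range(3), repeat=3) if sum(i > 0 for i in idx) >= 2]
for m, n, l in sorted(modes, key=lambda idx: eigenfrequency(*idx))[:5]:
    print(f"mode ({m},{n},{l}): {eigenfrequency(m, n, l) / 1e12:.3f} THz")
```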
If you have a much more complex structure that can't be approximated as a simple shape, the various eigenfrequencies and the spatial distribution of the electromagnetic field can be determined using a 3D field solver (e.g., the FDFD technique). A field solver you would normally use for IC packages can also be used for modeling terahertz microcavity resonators.

Applications for terahertz microcavity resonators are still being researched, as are the device architectures required for different applications. Some proposed applications include:

- Sensing and imaging: High-Q terahertz microcavity resonators can be used for highly coherent imaging and sensing, with applications in molecular detection and biological imaging.
- Silicon photonics: While this application area is normally discussed in terms of SMF or MMF wavelengths, devices in this area can also operate at THz frequencies and will need terahertz microcavity resonators to act as filters and amplifiers.
- Communication: Currently, the world record for the highest data rate transmission belongs to an experimental wireless system operating at THz frequencies. Miniaturizing these systems at the chip level will require microcavity structures, including terahertz microcavity resonators.

The important advancement provided by these structures is that they can be realized on an integrated circuit. Today, these applications still involve large optical systems in which an infrared mode comb from a femtosecond soliton laser generates a terahertz wave through interference. Similarly, large systems are also used for the detection and manipulation of terahertz waves. Terahertz microcavity resonators are one class of components that can provide high-Q or low-Q reception of THz frequencies, which can then be passed to a detector element or other photonic circuit.

The range of useful materials for building terahertz microcavity resonators, or for building coupling structures, is also an open research question. Some material platforms used for terahertz microcavity resonators include:

- Silicon: This material is the most promising for the fabrication of terahertz devices and their integration alongside other electronic circuits.
- GaAs, other III-Vs, and II-VIs: This promising set of photonic materials has already shown interesting results at ~3 THz frequencies, particularly for the generation of laser light. This material platform is promising for photonics in general.
- Photonic crystals: Periodic nanostructures fabricated through chemical deposition methods provide a tunable platform for building a range of terahertz devices, including terahertz microcavity resonators.
- Dielectrics: This broad range of materials includes oxides, salts, polymers, and other materials that can support transmission or absorption in various THz frequency ranges.

For integration, the best set of materials should bond to the industry's current range of semiconductors; microcavity resonator materials should be chosen to integrate into existing semiconductor materials platforms and manufacturing processes. As technology and designs push into more advanced spaces in the years to come, more advanced software that can navigate the nuances and challenges of THz components will be necessary.
Coursing through the fiber-optic veins of the Internet are photons of light that carry the fundamental bits of information. Depending on their intensity, these photons represent bits as 1s and 0s. This on-and-off representation of information is part of what physicists call “classical” phenomena. But photons of light have “quantum” properties as well, which, when exploited, provide more than simply a 1 or 0; these properties allow photons to represent 1s and 0s simultaneously. When information is approached from a quantum perspective, say scientists, encryption can be perfectly secure and enormous amounts of information can be processed at once. This field of quantum information –- the transmission and processing of data governed by quantum mechanics –- is rapidly moving beyond the lab and into the real world. Increasingly, researchers are conducting experiments within the same commercial fiber that transmits information in the classical way. For the most part, though, the two types of information have not intermingled: quantum information has been sent only over dedicated fiber. Now researchers at Northwestern University have shown that quantum information, in the form of “entangled photons,” can travel over the same fiber as classical signals. Additionally, the researchers have sent the combination signal through 100 kilometers of fiber – a record distance for entangled photons even without the classical signal. This marriage of quantum and classical optics shows that traditional optical tools can be used to send quantum encryption keys, based on entangled photons (some other schemes rely on single photons). In the future, this new technique might also enable long-distance networking between quantum computers, says Carl Williams, coordinator of the Quantum Information Program at the National Institute of Standards and Technology in Gaithersburg, MD. At the heart of the Northwestern experiment are the entangled photons: pairs of photons with interconnected properties. That is, looking at one photon in an entangled pair will reveal what the result of looking at the other photon would be – no matter how far apart the photons are. Entangled photons can be used in encryption by encoding information about a key in the photons. Then if an eavesdropper intercepts one photon of the entangled pair, the entire transmission is altered, alerting the code makers. Furthermore, entangled photons used for quantum computing could be split up and shared across a network of many quantum computers. Such photon pairs are “important whether the application is cryptography or anything else,” says Prem Kumar, professor of electrical and computer engineering and physics at Northwestern and lead scientist on the project. The first step in the experiment, then, was for the researchers to create entangled photons. Traditionally, shining laser light into a type of crystal has produced entangled photons. But it’s been difficult to use entangled photons made from crystals, because in transferring them into a fiber, you “lose the quality of the entanglement,” says Williams. Instead, Kumar’s team created their photon pairs by exploiting a similar, recently developed process that can occur within long lengths of standard fiber. The photons start in fiber and remain in it for the duration of the experiment, retaining their entanglement properties. The researchers pulsed polarized laser light through 300 meters of coiled fiber. 
It is this property of polarization (the orientation of the photons) that allows it to become entangled when the pairs of photons are created: if the polarization of one photon is measured, the polarization of the other photon is instantly known. Within the fiber, about one pair of polarization-entangled photons is created every microsecond, Kumar says, and the rate can be increased 100-fold by pulsing the light faster, he adds. Next, the entangled photons are split apart and each is directed into 50 kilometers of fiber (for a total of 100 kilometers), where they join a classical signal. At the opposite ends of the fibers, the photons are separated from the communication signals, and shoot towards two different photon detectors, built to see one photon at a time. Kumar says he knows he’s successfully sent entangled photons when both detectors see certain types of polarized photons at the same time. There are still challenges to using traditional fiber-optic cable and sending entangled photons 100 kilometers. Even the best quality commercial fiber has very small geometric inconsistencies, Kumar says, which can alter the polarization of the photon pairs slightly, decreasing the quality of entanglement – and rendering the quantum information useless. These slight changes in polarization can usually be adjusted for by sending the photons through special polarization devices right before they hit the detector, but it is difficult to know exactly how to adjust these devices to best compensate for the change in polarization. Interestingly, Kumar adds, the classical signal traveling with the quantum signal, as in the experiment, can help. It can track imperfections in the fiber encountered by the entangled photon, and relay this information so the polarization control device can be set to compensate appropriately. Right now, Kumar’s team is working on testing the distance limits of entangled photon transport and determining how many more classical signals they can add to the line and still retrieve the quantum information stored in the entangled photons. Because in real-world fiber optics, multiple signals pass through at once, it would be useful to know how many classical signals can share the fiber with a quantum signal. According to other scientists working in the field of quantum information, the fact that Kumar’s team has combined fiber-generated entangled photons with classical information, and sent the total signal over a record distance in a traditional fiber line is an exciting advance. “Pieces have been shown, but this puts it all together,” says Williams, who calls it “a remarkable demonstration.” Jeffrey Shapiro, professor of electrical engineering at MIT, says it is “great work…Prem [Kumar] works both on classical and quantum communication, and is one of the people who’s well suited to address both sides.” Ultimately, as quantum information matures, it will become more integrated into traditional fiber technology, says Kumar. “My goal is to make quantum optics applicable,” he notes. “Fiber-based quantum optics can piggyback on billions of dollars in optical communications technology. We want to ride that wave.”
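To get a feel for why sending entangled pairs through 100 kilometers of installed fiber is hard, the sketch below estimates how many pairs survive, assuming a typical telecom-fiber attenuation of about 0.2 dB/km (an assumed figure, not one quoted in the article) and 50 km of fiber per photon.

```python
# Rough survival estimate for polarization-entangled photon pairs in fiber.
ATTENUATION_DB_PER_KM = 0.2   # assumed telecom-band fiber loss
ARM_LENGTH_KM = 50.0          # each photon of the pair travels 50 km

loss_db = ATTENUATION_DB_PER_KM * ARM_LENGTH_KM           # 10 dB per arm
single_photon_survival = 10 ** (-loss_db / 10.0)          # ~0.10
pair_survival = single_photon_survival ** 2               # both photons must arrive

print(f"loss per arm: {loss_db:.1f} dB")
print(f"single-photon survival probability: {single_photon_survival:.2f}")
print(f"pair (coincidence) survival probability: {pair_survival:.3f}")
# With roughly one pair generated per microsecond, as stated in the article, this
# still leaves on the order of 10,000 detectable coincidences per second,
# before accounting for detector efficiency.
```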
Atoms are tricky to control. They can zip around, or even tunnel out of their containment. In order for new precision measurement tools and quantum devices to work—and work well—scientists need to be able to control and manipulate atoms as precisely as possible. That’s especially true for optical atomic clocks. In these clocks, a cold, excited atom’s electrons swing back and forth in what’s called a dipole, vibrating like a plucked string. Scientists rapidly count those swings with a laser, dividing a second into quadrillionths of a second. However, even the best optical atomic clocks face decoherence—the atom falls back to its ground state, the laser loses the signal, and the clock winds down. This means optical atomic clocks can only take measurements for a few seconds before the atoms need to be “reset.” Scientists are continually exploring ways to increase those coherence times. Using optical tweezers, Aaron Young, along with other members of the Kaufman and Ye groups at JILA, have reached record-setting coherence times of more than half a minute. Their findings were recently published in Nature. “The trick is to use separate sets of tweezers to prepare and measure the atoms, and to hang on to the atoms while they ring down. This makes it possible to optimize the second set of tweezers to preserve coherence for as long as possible, without having to worry about competing requirements associated with other phases of the experiment,” Young said. Optical atomic clock technology Optical atomic clocks are incredibly varied, but there are two popular means for controlling the atoms: ion traps, and optical lattices for trapping neutral atoms. Each approach has its strengths and weaknesses. Trapped ion clocks measure the oscillations of a single charged atom, or ion. That atom is pristine, well-characterized, and well-controlled, however, due to the fundamental noise associated with quantum measurements, scientists need to run the trapped ion clock many times to obtain a precise measurement. Lattice clocks, on the other hand, use standing waves of reflected lasers to form an egg carton-shaped lattice that can hold many atoms. This way, they can interrogate many thousands of atoms in parallel to obtain precise measurements in a short amount of time. But it’s difficult to control any of those thousands of atoms individually, and interactions between these atoms must be well-characterized — a rich and complicated endeavor in its own right. Controlling and preventing these interactions is where optical tweezers come in. Optical tweezers are highly-focused laser beams capable of grabbing and moving individual atoms—something the Kaufman Group has a lot of experience doing. “With the tweezers, our traps are more or less independent,” Young said. “It gives you a lot of control over what kind of traps you can make.” The group uses this extra control to preserve quantum coherence, and minimize many of the effects that can limit clocks. A hybrid clock of cigar pancakes Young and the team used lasers to create a vertical lattice of traps, like stacked pancakes. The optical tweezers pierce these pancakes, looking like little cigar-shaped tubes. This creates a two-dimensional array composed of hundreds of spherical traps that each contain a single atom. This pancake-cigar architecture allows for very quick cooling and trapping of the atoms, at which point they are easily transferred to a second set of tweezers designed specifically for clock physics. 
Because the atoms are well-chilled, the second set of tweezers can make very shallow traps for the clock. Shallow traps minimize the number of photons that could interfere with the atoms, and they reduce the power required for the laser, making it possible to make more traps, and trap more atoms. They can also space these traps far enough apart so the atoms cannot move around or crash into their neighbors. All of this results in record coherence times—48 seconds. To put that in perspective, if every oscillation took about a full second—like the pendulum on a grandfather clock—you would only have to wind this clock once every few billion years. “This long lifetime is related to what people call a ‘quality factor’ – it’s the number of times an oscillator swings before it rings down. The quality factor of our experiment is the highest we know of in pretty much any physical system, including, depending on how you compare them, various astronomical systems like spinning neutron stars or planetary orbits,” Young said. More than a clock “What we’ve effectively done is put 150 very coherent qubits in the same place, which serves as a really good starting point for engineering interactions,” Young said. A clock with controllable interactions could be used to engineer quantum states that allow for even more precise measurements of time. But the Kaufman and Ye Groups see potential to use this technique for another quantum device: quantum computers. With exquisite control of each high-coherence atom, the atoms can act as a qubit for the computer to perform calculations. Young and Kaufman also see this as a “zeroth order step” in physics research. Physicists are continually seeking better control over atoms to manipulate interactions between them, and study the results—and this hybrid tweezer clock is a promising means of achieving that control for long periods of time. By studying and controlling those interactions, physicists can better understand how the quantum world works, and those discoveries could lead to new advances in quantum-based technologies. Their study was published in Nature on December 17th, 2020 and was supported by a National Science Foundation Physics Frontier Center grant and a grant from the National Research Council.
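The "number of swings before the clock rings down" figure quoted above is easy to estimate. The sketch below assumes the strontium optical clock transition near 429 THz (an assumed value; the article does not give the transition frequency) together with the reported 48-second coherence time.

```python
# Order-of-magnitude estimate of how many times the atomic "pendulum" swings
# during one coherence time, i.e. the quality-factor framing used in the article.
CLOCK_FREQUENCY_HZ = 4.29e14   # assumed: strontium optical clock transition (~429 THz)
COHERENCE_TIME_S = 48.0        # record coherence time reported in the article

swings = CLOCK_FREQUENCY_HZ * COHERENCE_TIME_S
print(f"oscillations before the clock rings down: ~{swings:.1e}")   # ~2e16
```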
Our connected devices are hiding a big secret. They use energy—a lot of it. Every time you use your phone, your computer, or your smart TV to access the internet, you’re sending data requests to warehouse-sized buildings around the world, full of hundreds of thousands of servers. These data centers are among the most energy-intensive systems on the planet, representing approximately 10% of global electricity generation (though more conservative estimates put it at 3%). Yet we’re still blindly making classic computers—and they’re getting bigger and even more energy dense. China is home to the most energy-intensive supercomputer in the world, the Tianhe-2 in Guangzhou. This machine uses about 18 megawatts of power, and is expected to be succeeded by the exascale Tianhe-3, which will only further increase this extraordinary level of energy consumption. For reference, the average hydroelectric dam in the US produces approximately 36 megawatts of power. This is just one reason why quantum computing is key to the future. In addition to holding the potential to solve some of the world’s most computationally challenging problems, quantum computers use significantly less energy, which could lead to lower costs and decreased fossil-fuel dependency as adoption grows. (Disclosure: I’m also the CEO of a quantum-computing company.) Unlike classical computers, which use binary bits to encode information as 1s or 0s, quantum computers work using qubits. Thanks to the “weirdness” of quantum mechanical properties, qubits can represent both 1s and 0s at the same time, allowing quantum computers to find optimal solutions that classical systems cannot, all while using less energy. Here’s why: For a quantum processor to exhibit quantum mechanical effects, you have to isolate it from its surroundings. This is done by shielding it from outside noise and operating it at extremely low temperatures. Most quantum processors use cryogenic refrigerators to operate, and can reach about 15 millikelvin–that’s colder than interstellar space. At this low temperature, the processor is superconducting, which means that it can conduct electricity with virtually no resistance. As a result, this processor uses almost no power and generates almost no heat, so the power draw of a quantum computer—or the amount of energy it consumes—is just a fraction of a classical computer’s. And then there’s the price. Most modern classical supercomputers use between 1 to 10 megawatts of power on average, which is enough electricity to meet the instantaneous demand of almost 10,000 homes. As a year’s worth of electricity at 1 megawatt costs about $1 million in the US, this leads to multimillion-dollar price tags for operating these classical supercomputers. In contrast, each comparable quantum computer using 25 kilowatts of power costs about $25,000 per unit per year to run. Businesses are constantly looking for a competitive advantage, especially in an era of shrinking margins and fierce competition. In the case of computing, they’re looking for better, faster, or more efficient ways to solve problems than a classical computer. In the future, most quantum applications will utilize hybrid computing, which is a combination of classical and quantum computing that will provide a workable alternative to this unsustainable status quo—one that unlocks new commercial applications while dramatically curbing energy usage and costs. 
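The operating-cost comparison above follows from straightforward arithmetic, sketched below with an assumed average electricity price of $0.11 per kilowatt-hour (the article quotes the resulting annual totals, not the rate).

```python
def annual_energy_cost(power_kw, price_per_kwh=0.11):
    """Yearly electricity cost for a machine drawing power_kw continuously."""
    hours_per_year = 24 * 365
    return power_kw * hours_per_year * price_per_kwh

# Classical supercomputer drawing 1 MW vs. a quantum system drawing 25 kW.
print(f"1 MW supercomputer  : ${annual_energy_cost(1000):,.0f} per year")   # ~$964,000
print(f"25 kW quantum system: ${annual_energy_cost(25):,.0f} per year")     # ~$24,000
```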
With hybrid, the hard parts of commercial computing that aren’t suitable for existing classical systems can be sent to a quantum processing unit and returned to a classical computer. High-energy portions of hybrid applications will be run on quantum computers—often through the cloud—while the low-energy pieces are reserved for classical. Hybrid computing means utilizing the best of both the quantum and classical worlds, and lowering the barriers for companies of all sizes to get started using quantum computers. Thanks in part to hybrid computing, early quantum applications are already being used in industries including automotive, manufacturing, and finance. Volkswagen is using quantum computers to build early applications that will be able to optimize public transportation routing in cities around the world. DENSO, a leading auto-parts manufacturer based in Japan, has reported that it can reduce gridlock and improve efficiency of autonomous robots on its factory floors with the help of an application built with a quantum computer. Quantum computing is showing signs of early benefits today, but there’s more to do before we see fully practical deployment of quantum computing in production. We need continued buy-in and investment from both governments and businesses to achieve widespread adoption. We also need to train and develop the next generation of expertise and talent in the quantum workforce. Finally, we need to continue breaking down barriers to using quantum computers with affordable, flexible cloud access and developer-friendly software and tools. Quantum computers hold the promise to solve today’s toughest business problems and impact the bottom line for companies in virtually every industry. They’re also a key tool we can use to combat the looming threat of ever-growing energy use of classical computing. Businesses are already starting to feel the pressure to get their heads in the quantum-computing game, but the impetus goes beyond innovation and technological competition for a single company. It extends to a collective goal: ensuring our world’s computing power doesn’t outstrip our planet’s ability to support it. Correction: The previous headline for this piece “We’ll run out of energy in 20 years if we don’t switch to quantum computing” overstated the threat to global energy production. The headline has been updated to better reflect the article text. In addition, the article has been updated to more accurately explain the costs of electricity generation and use.
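The classical/quantum split described above is often organized as a feedback loop: a classical optimizer proposes parameters, the computationally hard evaluation is dispatched to a quantum processor (typically over the cloud), and the result steers the next classical step. The sketch below is a generic, hypothetical illustration of that pattern; evaluate_on_qpu is a stand-in stub faked with a noisy classical function, not a real vendor API.

```python
import random

def evaluate_on_qpu(params):
    """Stand-in for the 'hard part' dispatched to a quantum processing unit.
    Faked here with a noisy classical cost function so the sketch runs locally."""
    x, y = params
    true_cost = (x - 1.0) ** 2 + (y + 2.0) ** 2
    return true_cost + random.gauss(0.0, 0.001)   # QPU results are statistical

def hybrid_optimize(steps=200, lr=0.1):
    """Classical outer loop that only calls the (expensive) quantum evaluation."""
    params = [0.0, 0.0]
    for _ in range(steps):
        # Finite-difference gradient estimated from QPU evaluations.
        grads = []
        for i in range(2):
            shifted = list(params)
            shifted[i] += 1e-2
            grads.append((evaluate_on_qpu(shifted) - evaluate_on_qpu(params)) / 1e-2)
        params = [p - lr * g for p, g in zip(params, grads)]
    return params

print(hybrid_optimize())   # converges near [1.0, -2.0]
```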
If you are looking for some great science websites for interactive learning, then these eleven-plus sites should, at the very least, scratch an itch. Most of these are aimed at younger learners, but some will be as, if not more, entertaining for adults.

1. Khan Academy is great for people of all ages

Khan Academy is one of the best resources for STEM learning on the web. And, guess what? It is free. This interactive website is filled to the brim with fantastic content led by professionals and teachers who are experts on the content, with occasional STEM celebrity appearances. There is not that much gamification on this website. Most of the learning is done through fun interactive quizzes. The site is perfect whether you need to build on the topics you are currently learning at school or are an adult learner. Khan Academy has courses for every level, from elementary school to college.

2. Curiosity Machine will teach you about AI

Curiosity Machine helps children build, share, and receive feedback from experts. Its main focus is on teaching children, and their parents, about the power of Artificial Intelligence, and on bringing family members together to learn about and build their own AI. It has a specific "Family Challenge," which is a "free, hands-on AI education program that brings families, schools, communities, and technology know-it-alls together to give everyone the chance to learn, play and create with AI." Families will be guided through the basics of AI and are then encouraged to look around their local communities for potential problems to solve using their new skills. Proposals can then be submitted to win the competition.

3. Teachers TryScience is full of online experiments

Teachers TryScience is a website specifically designed to spark any young mind's interest in science, technology, engineering, and math. At its very core, it aims to bring design-based learning to children at home or at school. As the website explains, "to solve a problem in environmental science, students might need to employ physics, chemistry, and earth science concepts and skills." To this end, it has a large collection of interactive experiments, field trips, and other adventures. It also includes lesson plans, strategies, and tutorials for teachers to better help them deliver awe-inspiring science lessons for their ever-curious students.

4. The Exploratorium is the go-to site for interactive learning

The Exploratorium is the website arm of the San Francisco Exploratorium. This site offers hands-on experiences that will help teach children about basic, and more complex, scientific principles. It covers subjects from many disciplines of science, from biology and earth science to astronomy. The site also has a parent and teacher section that provides free resources to help you plan and incorporate its interactive material to boost your child's learning.

5. Science Kids will engage your kid's mind

Science Kids is another interactive learning website that focuses on teaching children the wonders of science. The site has a great variety of interactive science games covering subjects from living things to physical processes and everything in between. The great thing about this site's content is that it not only educates young minds but helps them put that knowledge to practical use to cement it in their memory. One particularly useful game will have your child design and build a virtual electrical circuit. Each subject comes in modules that are then subdivided into subcategories.
Living things, by way of example, is divided into food chains, microbes, the human body, and so on.

6. BrainPOP will do just that

BrainPOP is the place for science learning, and it's very well designed to boot. It is a very active site for young students, with a myriad of animations, movies, and short interactive quizzes. It covers topics like cellular life and genetics, ecology and behavior, forces of nature, our fragile environment, scientific inquiry, and paleontology and anthropology. So any young aspiring scientist is bound to find something that will spark their interest. It also has some interactive coding lessons, which are always fantastic ways to learn something they might not normally be exposed to. The site will have them hacking government websites in no time - only joking, of course.

7. HHMI BioInteractive - it's in the name

HHMI's website is full of great 3-D interactives, virtual labs, and printable activities for you to use. Its material is both engaging and interesting for science buffs of all ages. These guys are famed for their award-winning virtual labs and high-quality informative videos, so you know you are in good hands. Their site includes "Click & Learn" activities with embedded video clips and animations, and videos with stop points and assessments to help check you've been paying attention.

8. Annenberg Learner Interactives is a great resource for Earth Science students

Annenberg Learner Interactives' Earth Science-related topics are full of great and easy-to-understand graphics and other interactive content. It has a good collection of interactive lessons covering the big topics, from the Earth's structure to plate tectonics. The site also covers many other subjects within Earth Sciences, such as the Rock Cycle and Volcanoes, which really makes this subject come alive for any young student. It also has other resources for other scientific subjects, with interactive games and other lessons.

9. National Geographic Kids is fun and educational

Being created by National Geographic, you know you can trust this site to be top quality. And it doesn't disappoint. This site includes a large collection of videos, interactive activities, and fun games that will keep children of all ages engaged for hours on end. National Geographic Kids' site is broken down into helpful subcategories for ease of navigating your child's learning. Each section contains extensive and informative write-ups on different animals, from lions to whales, supported with world-class National Geographic footage. Each section also includes memory games, quizzes, and other activities to reinforce their learning by applying their new-found knowledge.

10. PhET Interactive Simulations is all about physics simulations

PhET Interactive Simulations is a real gem of an interactive and fun science-related website. Built and run by the University of Colorado Boulder, it has a vast collection of simulators covering most topics within physics, from circuits to waves to quantum mechanics. Be warned, however: you might find yourself aimlessly playing around with variables without noticing hours of your precious time have passed by. Do not, we repeat, do not, try the spring simulation; it is too much fun. It also has some materials covering Earth Science, chemistry, and life sciences, but these are far less extensive.

11. Wonderville is great for all ages

Wonderville is another great science-related website that is packed with interactive activities for children.
According to the website, Wonderville "makes learning science fun for kids. We help teachers teach and students learn. Used in 170 countries, our award-winning STEM content helps create lifelong learners." Other than fun and entertaining games, it also has a very good blog for the more curious children who want to go deeper into a subject.

12. Adults love using Brilliant.org

The interactive games on this website do not try to teach you through memorization. The Brilliant team is dedicated to teaching you how to think critically about STEM topics. From Geometry to Quantum Computing, this website is an excellent way to spend your free time if you are a dedicated lifelong learner. Scientific Thinking is one of our favorite courses on Brilliant.org.

13. The Raspberry Pi Foundation

Raspberry Pi is a powerful, tiny, and affordable computer that can be used to do everything from creating your own DIY projects at home to learning programming. The mini-computer is great for both kids and adults interested in getting into the science of computing and programming.
Scientists at Cambridge University are working on combining two fields of solid state physics research, spintronics and superconductors, in order to develop what they hope could become the foundation for the next generation of datacenter technology — perhaps within the next decade. Data centers are the engines of the digital economy. But they are also very energy intensive in their own right — with the researchers citing estimates that some three per cent of the power generated in Europe is already being used by data centers. One impetus for the research is therefore to apply ‘superspin’ technology to substantially reduce the power consumption of high performance computing and data storage. Superconductors allow for the propagation of electrical charge without electronic resistance, and therefore hold out the tantalizing promise — in computing kit terms — of carrying electronic charge with zero energy loss. Albeit, at this stage in the research, there is still a question mark over whether the cooling requirements of utilizing superconductors will result in less energy consumption overall or not. Hence the need for further research. “Superconductivity necessarily requires low temperature,” explains Dr Jason Robinson, one of the project leads. “No one has discovered room temperature superconductivity. “The crunch question is: is the energy required to cool going to be smaller than current energy loss due to the energy efficiency of spintronics. If it costs more to cool than it currently does in terms of what we lose, currently, then it’s not worth it. That’s what we’re exploring.” “Our basic calculations suggest that superconducting spintronics will be massively more energy efficient than current spintronics,” he adds. Another driver for the research is to use superspin as a possible alternative to semiconductor technology — as a new route to sustain Moore’s Law of shrinking electronics, just as the ability of engineers to pack more transistors onto integrated circuits is starting to look like it’s coming to the end of the road. Spintronics proposes utilizing the spin alignment of electrons as a medium to store (the 0 or 1 of) digital data. “Information technology now is based on such small objects you just can’t use conventional superconductors,” notes Robinson. “By combining superconductivity with spintronics it’s not only that you can create circuits without [energy] dissipation, but it’s that you create new physics. So that means there’s a lot of new opportunities created through this combination. “There’s a lot of undiscovered physics to be explored.” The Cambridge-led project has received a £2.7 million grant from the UK’s Engineering and Physical Sciences Research Council, with a focus on developing a superconducting spintronics prototype device over the next five years to prove out their theoretical modeling that the combined tech is indeed more energy efficient than just using spintronics. “It’s important to understand that this is the first ever superconductivity and spintronics funded program,” adds Robinson. “The way the grant has been set up in the first three years there’s a series of parallel projects. Some are more applications biased than others but the application stuff has to develop alongside the science… Everything we do is moving us towards the prototype.” “It’s a fundamental program with the aim of triggering applications in spintronics. 
There’s a lot of science we don’t currently understand and we need to understand in order to be able to make the best possible prototype. We have enough science to know that we can make a prototype. The question is can we make the best prototype," he adds. "[And] what do we need to do in order to be able to make a device that’s switchable — that you can not only store information on, but you can process information with as well." The project draws on prior research conducted at Cambridge, and elsewhere, to combine spintronics and superconductors — a feat previously thought to be impossible, thanks to superconductivity canceling out electronic spin. However, the same research group at Cambridge found a workaround for that, involving magnets: it initially utilized a layer of a rare-earth magnetic material, although the group has since proved that various magnetic materials can be used, according to Robinson. "A few years ago our group discovered that actually if you combine superconductors with magnets you can create a new kind of Cooper Pair [paired electrons], which instead of having two electrons with anti-parallel spins you can create pairs which have parallel spins. So now you have both the benefits of superconductivity and the ability to carry spin in the superconducting state." Another area he is excited about from the combination of superconductivity and spintronics is the potential for using the technique to further quantum computing. "It introduces lots of new ideas that were not possible previously. So that’s exciting, and indeed a large part of our grant is to develop the science of those other areas as well," he adds.
<urn:uuid:4b7fce48-4372-45bd-92d3-5d6251cebbd6>
CC-MAIN-2021-04
https://develop.techcrunch.com/2016/04/15/superspin-research-project-aims-to-drive-more-energy-efficient-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703547333.68/warc/CC-MAIN-20210124044618-20210124074618-00684.warc.gz
en
0.931329
1,078
3.65625
4
Where would we be without computers? Whether giving us the chance to work remotely, work on files with colleagues in real time, or for recreational activities like streaming – there can be no doubt that computing devices have changed the way we go about our day-to-day lives. However, while more ‘traditional’ computers are great for completing run-of-the-mill tasks, there are many more complex problems in the world that these machines will struggle to solve. For problems above a certain size and complexity, traditional machines simply don’t have enough computational power to tackle them. To put this in perspective, Fugaku, the world’s fastest supercomputer, is over 1,000 times faster than a regular computer, and in 2019 Google claimed its Sycamore quantum processor was more than a billion times faster at solving problems than a supercomputer. Given their processing superiority, if we want to have a chance at solving some of the world’s most complex issues, we must look to quantum computers.
Understanding Quantum Computing
In case you are unfamiliar with the concept, quantum computing leverages the quantum-mechanical principles of superposition and entanglement in order to create states that scale exponentially with the number of quantum bits – or ‘qubits’. Rather than just being on or off, qubits can also be in what’s called ‘superposition’ – where they’re both on and off at the same time, or somewhere on a spectrum between the two. Put more simply, for scientists to properly simulate real-world situations, the calculations they make on a computer must be able to handle uncertainty in a way that traditional machines, and even supercomputers, can’t. This is the key characteristic of quantum computing. Today, real quantum processors are used by researchers from all over the world to test out algorithms for applications in a variety of fields. Indeed, these computers may soon be able to spur new breakthroughs in science: medication for currently incurable diseases, materials for more efficient devices and structures such as more powerful solar panels, and algorithms to quickly direct resources to where they are needed, such as ambulances.
Quantum Computing and Cybersecurity
However, not only do these machines have to be protected from hackers, they themselves could also pose a threat to digital life as we know it. Right now, for example, cyberattacks can be carried out with relative ease, due to the fact many organisations do not have protections in place for their confidential information. As such, placing a much greater emphasis on improving the security of communications and data storage is far more crucial for protecting the sensitive information of states, private entities and individuals than it was, say, 20 years ago. However, if quantum computers can launch attacks that break asymmetric cryptography, they render the entire PKI-based encryption method we currently use to protect our sensitive information obsolete. Which begs the question: then what? To take advantage of the time when quantum computers will be able to break such systems, some countries are already beginning to collect encrypted foreign communications, with the expectation that they will be able to extract valuable secrets from that data in the future. Indeed, countries need to be aware that when quantum cryptanalysis does become available, it will significantly affect international relations by making any broadcast state communications open to decryption.
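The superposition described above can be made concrete with a few lines of linear algebra. The sketch below is an illustrative toy, not tied to any hardware in the article: it represents a single qubit as a normalized two-component state vector, puts it into an equal superposition with a Hadamard gate, and estimates the measurement statistics by sampling.

```python
import numpy as np

# Basis states |0> and |1> as vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

state = H @ ket0                      # superposition state
probs = np.abs(state) ** 2            # Born rule: measurement probabilities

rng = np.random.default_rng(seed=0)
samples = rng.choice([0, 1], size=10_000, p=probs)

print("amplitudes:", state)                         # ~[0.707, 0.707]
print("P(0), P(1):", probs)                         # [0.5, 0.5]
print("measured frequencies:", np.bincount(samples) / samples.size)
```

Each individual measurement still returns a definite 0 or 1; the superposition shows up only in the statistics, which is why the "on and off at the same time" phrasing is a metaphor rather than a literal description.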
For countries that extensively rely on encryption to secure military operations, diplomatic correspondence or other sensitive data, this could be a watershed event. As quantum computers continue to improve, businesses and the general public will become increasingly aware of the threat they pose to today’s cryptographic systems – and, by extension, to all digital security globally. The ability to update cryptographic algorithms, keys and certificates quickly in response to advances in cracking techniques and processing speed will therefore be key. To prepare for these inevitable cryptographic updates, more enterprises than ever will need to explore automation as a critical component for ensuring future-proofed security. Quantum-resistant communication technology will soon be an inevitable part of cybersecurity mitigation. Business and technology strategists must monitor progress on the evolution and potential implications of quantum computing starting now. Confidential data, over-the-air software updates, identity management systems, connected devices, and anything else with long-term security obligations must be made quantum safe before large quantum computers are developed and become reliable, meaning their error rates are low. We have announced collaborations with ISARA Corporation and ID Quantique to make quantum-safe crypto more widely available for data protection in the cloud, applications and networks. Innovations like these can help combat the future security threats of quantum computing. With governments and organisations, such as NIST, racing to become cryptographically quantum resilient, readying enterprises for this change has never been more important. *** This is a Security Bloggers Network syndicated blog from Enterprise Security – Thales blog authored by Aline Gouget. Read the original post at: https://dis-blog.thalesgroup.com/security/2020/08/05/quantum-computing-and-the-evolving-cybersecurity-threat/
<urn:uuid:e7488572-c96f-4595-b646-d5101f5d5c61>
CC-MAIN-2021-04
https://securityboulevard.com/2020/08/quantum-computing-and-the-evolving-cybersecurity-threat/
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703581888.64/warc/CC-MAIN-20210125123120-20210125153120-00685.warc.gz
en
0.925142
1,030
3.6875
4
Quantum computing is inevitable; cryptography prepares for the future
Quantum computing began in the early 1980s. It operates on principles of quantum physics rather than the limitations of circuits and electricity, which is why it is capable of processing highly complex mathematical problems so efficiently. Quantum computing could one day achieve things that classical computing simply cannot. The evolution of quantum computers has been slow. Still, work is accelerating, thanks to the efforts of academic institutions such as Oxford, MIT, and the University of Waterloo, as well as companies like IBM, Microsoft, Google, and Honeywell. IBM has held a leadership role in this innovation push and has named optimization the most likely application for consumers and organizations alike. Honeywell expects to release what it calls the “world’s most powerful quantum computer” for applications like fraud detection, optimization for trading strategies, security, machine learning, and chemistry and materials science. In 2019, the Google Quantum Artificial Intelligence (AI) team announced that their 53-qubit machine (qubits are analogous to bits in classical computing) had achieved “quantum supremacy.” This was the first time a quantum computer was able to solve a problem faster than any classical computer in existence. This was considered a significant milestone. Quantum computing will change the face of Internet security forever — particularly in the realm of cryptography, which is the way communications and information are secured across channels like the Internet. Cryptography is critical to almost every aspect of modern life, from banking to cellular communications to connected refrigerators and systems that keep subways running on time. This ultra-powerful, highly sophisticated new generation of computing has the potential to unravel decades of work that have been put into developing the cryptographic algorithms and standards we use today.
Quantum computers will crack modern cryptographic algorithms
Quantum computers can take a very large integer and find its prime factors extremely rapidly by using Shor’s algorithm. Why is this so important in the context of cryptographic security? Most cryptography today is based on algorithms that incorporate difficult problems from number theory, like factoring. The forerunner of nearly all modern cryptographic schemes is RSA (Rivest-Shamir-Adleman), which was devised back in 1977. Basically, every participant of a public key cryptography system like RSA has both a public key and a private key. To send a secure message, data is encoded as a large number and scrambled using the public key of the person you want to send it to. The person on the receiving end can decrypt it with their private key. In RSA, the public key is a large number, and the private key is its prime factors. With Shor’s algorithm, a quantum computer with enough qubits could factor large numbers. For RSA, someone with a quantum computer can take a public key and factor it to get the private key, which allows them to read any message encrypted with that public key. This ability to factor numbers breaks nearly all modern cryptography. Since cryptography is what provides pervasive security for how we communicate and share information online, this has significant implications. Theoretically, if an adversary were to gain control of a quantum computer, they could create total chaos.
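The relationship described above — public key as a large number, private key as its prime factors — can be illustrated with a deliberately tiny, insecure toy example. The primes, exponent and message below are made up for illustration; real RSA uses primes hundreds of digits long, which is exactly what Shor's algorithm threatens.

```python
# Toy RSA with tiny primes -- insecure by construction, for illustration only.
p, q = 61, 53                 # the private prime factors
n = p * q                     # 3233: the public modulus
phi = (p - 1) * (q - 1)       # 3120
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = pow(msg, e, n)       # encrypt with the public key (n, e)
plain = pow(cipher, d, n)     # decrypt with the private key d
assert plain == msg

# An attacker who can factor n recovers the private key immediately:
def factor(n):                # brute force stands in for Shor's algorithm here
    return next((f, n // f) for f in range(2, n) if n % f == 0)

p2, q2 = factor(n)
d_recovered = pow(e, -1, (p2 - 1) * (q2 - 1))
assert d_recovered == d
print("factors:", p2, q2, "recovered private exponent:", d_recovered)
```

For a 2048-bit modulus the brute-force `factor` step is hopeless on classical hardware; Shor's algorithm on a sufficiently large quantum computer would make it tractable, which is the whole concern.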
They could create cryptographic certificates and impersonate banks to steal funds, disrupt Bitcoin, break into digital wallets, and access and decrypt confidential communications. Some liken this to Y2K. But, unlike Y2K, there’s no fixed date as to when existing cryptography will be rendered insecure. Researchers have been preparing and working hard to get ahead of the curve by building quantum-resistant cryptography solutions. When will a quantum computer be built that is powerful enough to break all modern cryptography? By some estimates, it may take 10 to 15 years. Companies and universities have made a commitment to innovation in the field of quantum computing, and progress is certainly being made. Unlike classical computers, quantum computers rely on quantum effects, which only happen at the atomic scale. To instantiate a qubit, you need a particle that exhibits quantum effects like an electron or a photon. These particles are extremely small and hard to manage, so one of the biggest hurdles to the realization of quantum computers is how to keep the qubits stable long enough to do the expensive calculations involved in cryptographic algorithms.
Both quantum computing and quantum-resistant cryptography are works in progress
It takes a long time for hardware technology to develop and mature. Similarly, new cryptographic techniques take a long time to discover and refine. To protect today’s data from tomorrow’s quantum adversaries, we need new cryptographic techniques that are not vulnerable to Shor’s algorithm. The National Institute of Standards and Technology (NIST) is leading the charge in defining post-quantum cryptography algorithms to replace RSA and ECC. There is a project currently underway to test and select a set of quantum-resistant algorithms that go beyond existing public-key cryptography. NIST plans to make a recommendation sometime between 2022 and 2024 for two to three algorithms for both encryption and digital signatures. As Dustin Moody, a NIST mathematician, points out, the organization wants to cover as many bases as possible: “If some new attack is found that breaks all lattices, we’ll still have something to fall back on.” We’re following closely. Participants in the NIST process have developed high-speed implementations of post-quantum algorithms on different computer architectures. We’ve taken some of these algorithms and tested them in Cloudflare’s systems in various capacities. Last year, Cloudflare and Google performed the TLS Post-Quantum Experiment, which involved implementing and supporting new key exchange mechanisms based on post-quantum cryptography for all Cloudflare customers for a period of a few months. As an edge provider, Cloudflare was well positioned to turn on post-quantum algorithms for millions of websites to measure performance and use these algorithms to provide confidentiality in TLS connections. This experiment led us to some useful insights around which algorithms we should focus on for TLS and which we should not (sorry, SIDH!). More recently, we have been working with researchers from the University of Waterloo and Radboud University on a new protocol called KEMTLS, which will be presented at Real World Crypto 2021. In our last TLS experiment, we replaced the key negotiation part of TLS with quantum-safe alternatives but continued to rely on digital signatures. KEMTLS is designed to be fully post-quantum and relies only on public-key encryption.
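Hybrid key exchange of the kind used in experiments like the one above has a simple core: two shared secrets — one classical, one post-quantum — are combined through a key-derivation function, so the session key stays safe as long as either component remains unbroken. The sketch below fakes both secrets with random bytes purely to show the combining step; it is not a real key exchange, the labels are illustrative, and it is not how Cloudflare's actual deployment is implemented.

```python
import hashlib
import hmac
import os

def hkdf_combine(salt: bytes, secrets: list, info: bytes, length: int = 32) -> bytes:
    """Minimal HKDF-style extract-and-expand used to combine several shared secrets."""
    prk = hmac.new(salt, b"".join(secrets), hashlib.sha256).digest()   # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                                           # expand
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Stand-ins for the two negotiated secrets (in reality: e.g. X25519 plus a lattice KEM).
classical_secret = os.urandom(32)
post_quantum_secret = os.urandom(32)

session_key = hkdf_combine(
    salt=b"tls-hybrid-demo",
    secrets=[classical_secret, post_quantum_secret],
    info=b"handshake key derivation",
)
print(session_key.hex())
```

Because the derivation mixes both inputs, an attacker must recover both the classical and the post-quantum secret to reconstruct the session key — the rationale behind deploying hybrids before the new algorithms are fully standardized.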
On the implementation side, Cloudflare team members including Armando Faz Hernandez and visiting researcher Bas Westerbaan have developed high-speed assembly versions of several of the NIST finalists (Kyber, Dilithium), as well as other relevant post-quantum algorithms (CSIDH, SIDH) in our CIRCL cryptography library written in Go.
Post-quantum security, coming soon?
Everything that is encrypted with today’s public key cryptography can be decrypted with tomorrow’s quantum computers. Imagine waking up one day, and everyone’s diary from 2020 is suddenly public. Although it’s impossible to find enough storage to keep a record of all the ciphertext sent over the Internet, there are current and active efforts to collect a lot of it. This makes deploying post-quantum cryptography as soon as possible a pressing privacy concern. Cloudflare is taking steps to accelerate this transition. First, we endeavor to use post-quantum cryptography for most internal services by the end of 2021. Second, we plan to be among the first services to offer post-quantum cipher suites to customers as standards emerge. We’re optimistic that collaborative efforts among NIST, Microsoft, Cloudflare, and other computing companies will yield a robust, standards-based solution. Although powerful quantum computers are likely in our future, Cloudflare is helping to make sure the Internet is ready for when they arrive.
<urn:uuid:424d26b3-3265-4594-89b8-4a9d9ea2cdb3>
CC-MAIN-2021-04
https://engineeringjobs4u.co.uk/securing-the-post-quantum-world
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610704833804.93/warc/CC-MAIN-20210127214413-20210128004413-00285.warc.gz
en
0.941563
1,624
3.703125
4
Physicists from MIPT and the Russian Quantum Center have developed an easier method to create a universal quantum computer using multilevel quantum systems (qudits), each one of which is able to work with multiple "conventional" quantum elements – qubits. Professor Vladimir Man'ko, Aleksey Fedorov and Evgeny Kiktenko have published the results of their studies of multilevel quantum systems in a series of papers in Physical Review A, Physics Letters A, and also Quantum Measurements and Quantum Metrology. "In our studies, we demonstrated that correlations similar to those used for quantum information technologies in composite quantum systems also occur in non-composite systems – systems which we suppose may be easier to work with in certain cases. In our latest paper we proposed a method of using entanglement between internal degrees of freedom of a single eight-level system to implement the protocol of quantum teleportation, which was previously implemented experimentally for a system of three two-level systems," says Vladimir Man'ko. Quantum computers, which promise to bring about a revolution in computer technology, could be built from elementary processing elements called quantum bits – qubits. While elements of classical computers (bits) can only be in two states (logic zero and logic one), qubits are based on quantum objects that can be in a coherent superposition of two states, which means that they can encode the intermediate states between logic zero and one. When a qubit is measured, the outcome is either a zero or a one with a certain probability (determined by the laws of quantum mechanics). In a quantum computer, the initial condition of a particular problem is written in the initial state of the qubit system, then the qubits enter into a special interaction (determined by the specific problem). Finally, the user reads the answer to the problem by measuring the final states of the quantum bits. Quantum computers will be able to solve certain problems that are currently far beyond the reach of even the most powerful classical supercomputers. In cryptography, for example, the time required for a conventional computer to break the RSA algorithm, which is based on the prime factorization of large numbers, would be comparable to the age of the universe. A quantum computer, on the other hand, could solve the problem in a matter of minutes. However, there is a significant obstacle standing in the way of a quantum revolution – the instability of quantum states. Quantum objects that are used to create qubits – ions, electrons, Josephson junctions etc. can only maintain a certain quantum state for a very short time. However, calculations not only require that qubits maintain their state, but also that they interact with one another. Physicists all over the world are trying to extend the lifespan of qubits. Superconducting qubits used to "survive" only for a few nanoseconds, but now they can be kept for milliseconds before decoherence – which is closer to the time required for calculations. In a system with dozens or hundreds of qubits, however, the problem is fundamentally more complex. Man'ko, Fedorov, and Kiktenko began to look at the problem from the other way around – rather than try to maintain the stability of a large qubit system, they tried to increase the dimensions of the systems required for calculations. They are investigating the possibility of using qudits rather than qubits for calculations. 
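The idea that one multilevel system can stand in for several qubits is, at the level of state vectors, just a relabelling of basis states. The toy sketch below is illustrative only and is not the researchers' construction: it takes a random four-level qudit state and reads it as a two-qubit state by reshaping its amplitudes, then computes the marginal measurement probabilities of each "virtual" qubit.

```python
import numpy as np

rng = np.random.default_rng(1)

# A random normalized state of a 4-level qudit: amplitudes for levels 0..3.
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)

# Relabel levels 0,1,2,3 as the two-qubit basis states 00,01,10,11.
two_qubit = psi.reshape(2, 2)          # entry [a, b] = amplitude of |a b>

probs = np.abs(two_qubit) ** 2
print("P(first virtual qubit = 0, 1): ", probs.sum(axis=1))
print("P(second virtual qubit = 0, 1):", probs.sum(axis=0))
print("joint probabilities:\n", probs)
```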
Qudits are quantum objects in which the number of possible states (levels) is greater than two (their number is denoted by the letter D). There are qutrits, which have three states; ququarts, which have four states, etc. Algorithms are now actively being studied in which the use of qudits could prove to be more beneficial than using qubits. "A qudit with four or five levels is able to function as a system of two "ordinary" qubits, and eight levels is enough to imitate a three-qubit system. At first, we saw this as a mathematical equivalence allowing us to obtain new entropic correlations. For example, we obtained the value of mutual information (the measure of correlation) between virtual qubits isolated in a state space of a single four-level system," says Fedorov. He and his colleagues demonstrated that on one qudit with five levels, created using an artificial atom, it is possible to perform full quantum computations—in particular, the realization of the Deutsch algorithm. This algorithm is designed to test the values of a large number of binary variables. It can be called the fake coin algorithm: imagine that you have a number of coins, some of which are fake – they have the same image on the obverse and reverse sides. To find these coins using the "classical method", you have to look at both sides. With the Deutsch algorithm, you "merge" the obverse and reverse sides of the coin and you can then see a fake coin by only looking at one side. The idea of using multilevel systems to emulate multi-qubit processors was proposed earlier in the work of Russian physicists from the Kazan Physical-Technical Institute. To run a two-qubit Deutsch algorithm, for example, they proposed using a nuclear spin of 3/2 with four different states. In recent years, however, experimental progress in creating qudits in superconducting circuits has shown that they have a number of advantages. However, superconducting circuits require five levels: the last level performs an ancillary role to allow for a complete set of all possible quantum operations. "We are making significant progress, because in certain physical implementations, it is easier to control multilevel qudits than a system of the corresponding number of qubits, and this means that we are one step closer to creating a full-fledged quantum computer. Multilevel elements offer advantages in other quantum technologies too, such as quantum cryptography," says Fedorov. More information: E.O. Kiktenko, A.K. Fedorov, O.V. Man'ko, and V.I. Man'ko. Multilevel superconducting circuits as two-qubit systems: Operations, state preparation, and entropic inequalities // Physical Review A arxiv.org/abs/1411.0157 E.O. Kiktenko, A.K. Fedorov, A.A. Strakhov, and V.I. Man'ko. Single qudit realization of the Deutsch algorithm using superconducting many-level quantum circuits // Physics Letters A 379, 1409–1413 (2015), arxiv.org/abs/1503.01583 E.O. Kiktenko, A.K. Fedorov, and V.I. Man'ko. Teleportation in an indivisible quantum system // Quantum Measurements and Quantum Metrology 3, 13–19 (2016), arxiv.org/abs/1512.05168 Provided by Moscow Institute of Physics and Technology
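The Deutsch algorithm mentioned above — the "fake coin" test — is small enough to simulate directly. The sketch below is a generic two-qubit state-vector simulation, not the single-qudit implementation described in the papers: it builds the oracle for a one-bit function f, runs the standard circuit, and reads out "constant" or "balanced" after a single oracle call.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)

def oracle(f):
    """Unitary U_f acting as |x, y> -> |x, y XOR f(x)> on two qubits."""
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    state = np.kron([1, 0], [0, 1])          # start in |0>|1>
    state = np.kron(H, H) @ state            # Hadamard on both qubits
    state = oracle(f) @ state                # single oracle call
    state = np.kron(H, I) @ state            # Hadamard on the query qubit
    p_query_is_1 = np.abs(state[2]) ** 2 + np.abs(state[3]) ** 2
    return "balanced" if p_query_is_1 > 0.5 else "constant"

print(deutsch(lambda x: 0))      # constant -> "constant"
print(deutsch(lambda x: 1))      # constant -> "constant"
print(deutsch(lambda x: x))      # balanced -> "balanced"
print(deutsch(lambda x: 1 - x))  # balanced -> "balanced"
```

Classically you would have to evaluate f on both inputs ("look at both sides of the coin"); the interference set up by the Hadamard gates answers the constant-versus-balanced question with one evaluation.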
<urn:uuid:7a9f89f4-a64f-4de8-8ee2-fb8aeac265cc>
CC-MAIN-2021-04
https://phys.org/news/2016-07-russian-physicists-approach-quantum.html?deviceType=mobile
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703497681.4/warc/CC-MAIN-20210115224908-20210116014908-00086.warc.gz
en
0.932899
1,465
3.671875
4
Post-quantum cryptography, also called quantum-resistant or quantum-safe cryptography, is the development of cryptographic systems for classical computers that are able to prevent attacks launched by quantum computers. During the 1980s, scientists speculated that if computers could take advantage of the unique properties of quantum mechanics, they could perform complicated computations much faster than classical, binary computers. It quickly became clear that a quantum computer, taking advantage of quantum properties such as superposition and entanglement, could complete certain types of complex calculations in a matter of hours -- even though it would take a classical computer several years to complete the same calculation. During the 1990s, after mathematician Peter Shor successfully demonstrated that a theoretical quantum computer could easily break the algorithm used for public key encryption (PKE), cryptographers around the world began to explore what a post-quantum cryptography system would look like. As of this writing, standards for how to implement post-quantum encryption are still emerging.
Pre-quantum vs. quantum vs. post-quantum cryptography
Quantum computers use the laws of quantum mechanics to process information in quantum bits (qubits). Because each qubit can be in a superposition of 0 and 1, a quantum computer can work through an exponentially larger space of possibilities than a classical, binary computer. Pre-quantum cryptography uses a specific type of cipher called an algorithm to transform human-readable data into secret code. The challenge of pre-quantum cryptography is to make encryption ciphers easy to understand but difficult to reverse engineer. In contrast, quantum cryptography relies on the physical properties of quantum particles, rather than mathematical ciphers, to protect data. A major challenge of post-quantum cryptography is that quantum physics is still an emerging scientific field of study, and prototypes for quantum computers are very expensive to build and operate.
The quest for quantum-resistant algorithms
In 2016, researchers from MIT and the University of Innsbruck built a small quantum computer that was able to successfully implement Shor's algorithm and find the factors for the number 15. Once researchers were able to demonstrate that Shor's quantum algorithm could be used to return the correct factors with a confidence level that exceeded 99%, it quickly became clear that the world's most widely used cryptographic methods could be broken by a quantum computer. In 2016, the National Institute of Standards and Technology (NIST) began to seek out submissions for algorithms that could potentially replace public key encryption, key encapsulation mechanisms (KEMs) and digital signatures. In response, mathematicians and programmers began experimenting with a wide variety of strategies to replace integer factorization as well as the discrete logarithmic problems used in the Rivest-Shamir-Adleman (RSA) algorithm, Elliptic Curve Digital Signature Algorithm (ECDSA), Elliptic Curve Diffie–Hellman Key Exchange (ECDH) and Digital Signature Algorithm (DSA) cryptosystems. Google's experiments in post-quantum cryptography, for example, involve coupling a classical elliptic curve algorithm with a post-quantum algorithm. The idea is that even if the post-quantum cryptography turns out to be breakable, the addition of an elliptic curve algorithm will still provide a measure of security.
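The factoring demonstration mentioned above can be unpacked classically: Shor's algorithm reduces factoring to finding the period of modular exponentiation, and only that period-finding step needs a quantum computer. The sketch below replaces the quantum step with brute force to show the surrounding arithmetic for N = 15 — it scales terribly with the size of N, which is precisely the point.

```python
from math import gcd

def find_period(a, n):
    """Smallest r > 0 with a^r = 1 (mod n). Brute force stands in for the quantum step."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical(n, a):
    assert gcd(a, n) == 1, "pick a coprime to n"
    r = find_period(a, n)                    # quantum period finding in real Shor
    if r % 2:
        return None                          # odd period: retry with another a
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None                          # trivial square root: retry with another a
    return gcd(x - 1, n), gcd(x + 1, n)

print(shor_classical(15, 7))   # period 4 -> factors (3, 5)
print(shor_classical(15, 2))   # period 4 -> factors (3, 5)
```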
Other popular strategies for creating quantum-resistant algorithms include the use of lattice, code-based and multivariate schemes. As of this writing, lattice schemes seem to be the most promising, largely because finding the shortest vector in a large, high-dimensional lattice is believed to be extremely difficult even for a quantum computer.
The future of post-quantum cryptography
The algorithms that support encryption today, including public key cryptography, are still considered to be safe for e-commerce because while quantum computing is real, the technology is expensive and use cases have their roots in scientific and government research. The race is on, however, among researchers who are trying to find a post-quantum encryption that works and researchers who are trying to break RSA and similar cryptosystems with quantum algorithms. Many experts believe that quantum computers powerful enough to break RSA will arrive within nine or 10 years, at which time RSA and similar asymmetric algorithms will no longer be able to protect sensitive data. This is why NIST is moving so aggressively to create a standard for post-quantum encryption. Experts recommend that while NIST is busy evaluating the effectiveness of proposed standards for post-quantum cryptography, organizations use the next couple of years to create a reference index for those applications that use encryption. Organizations should also keep track of the public and third-party encryption libraries they use. Once the strategies for implementing post-quantum cryptography have matured and a standard has been approved, the index can be used to develop a plan for how the organization will either replace or upgrade those applications that require cryptography.
Post-quantum cryptography vs. quantum key distribution
Post-quantum cryptography should not be confused with quantum key distribution (QKD). QKD simply allows a secret cryptographic key to be shared between two remote parties in such a way that key interception can be easily detected.
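Lattice-based schemes rest on problems such as learning-with-errors (LWE), in which a small amount of noise hides a secret even though the equations are otherwise linear. The sketch below is a stripped-down, textbook-style LWE bit encryption with made-up toy parameters — far too small to be secure, and not any specific NIST candidate — just to show where the noise and the modular arithmetic enter.

```python
import numpy as np

rng = np.random.default_rng(42)
q, n, m = 257, 16, 64          # toy parameters: modulus, secret length, number of samples

# Key generation: b = A s + e (mod q), with a small error vector e hiding s.
s = rng.integers(0, q, size=n)                    # secret key
A = rng.integers(0, q, size=(m, n))               # public matrix
e = rng.integers(-1, 2, size=m)                   # small noise in {-1, 0, 1}
b = (A @ s + e) % q                               # public vector

def encrypt(bit):
    rows = rng.integers(0, 2, size=m)             # random subset of the public samples
    u = (rows @ A) % q
    v = (rows @ b + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    d = (v - u @ s) % q                           # equals bit*(q//2) plus small noise
    return int(min(d, q - d) > q // 4)            # closer to q/2 -> the bit was 1

for bit in (0, 1, 1, 0):
    assert decrypt(*encrypt(bit)) == bit
print("toy LWE round-trips correctly")
```

Without the error term e, the public key would be a plain linear system solvable by Gaussian elimination; with it, recovering s is related to hard lattice problems, which is what the real candidate schemes build on at much larger parameters.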
<urn:uuid:982779d5-9177-4734-83c2-0bfd8881c5e6>
CC-MAIN-2021-04
https://searchsecurity.techtarget.com/definition/post-quantum-cryptography
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703519395.23/warc/CC-MAIN-20210119135001-20210119165001-00686.warc.gz
en
0.928534
1,039
3.96875
4
It seems that diamonds grown in a lab will have many roles to play in the electronics and engineering of the future. This could be because these precious stones contain (necessary) defects, often worked into the diamonds to order. The defects are created when a non-carbon atom takes the place of a carbon atom in the orderly molecular lattice that normally makes a diamond. Nitrogen vacancies (NVs), for example, have drawn some attention due to their potential in diagnostics, and their ability to emit red light when green light hits them, which has potential analytical value. NVs have recently demonstrated the ability to assess the flow of electrons across tiny strips of graphene with great accuracy and sensitivity. This may be important, particularly if graphene realizes its promise as the superconductor of the future. Diamonds in Quantum Computing Diamonds with vacancies could be used to transmit or store data at the quantum level. In other words, they could be used as repeaters across networks between computers, all of which would be…quantum in nature. Unfortunately, NVs have relatively poor optical qualities, meaning that their applications may be limited. For example, nitrogen vacancies may degrade the quality or integrity of quantum data over distance. This is a shame, as NVs also have long lifespans. Therefore, a team of scientists at Princeton (in collaboration with others from the Gemological Institute of America in New York and the UK company, Element Six) decided to investigate the potential of alternative vacancy types in the transfer and storage of quantum bits (or qubits). Previous work indicated that silicon vacancies (SiVs) significantly improved optical properties compared to NVs. However, SiVs also have a charge (namely 2+), which may impact their interactions with protons and electrons that are necessary for quantum data-transmission. More specifically, the charged vacancies, from past research, showed that SiVs did not hold coherence among the phonons (represent ‘noise’ in a quantum data system) for the required lengths of time. This quantum-capable wafer may be only so useful without the ability to network with other quantum processors. (Source: Steve Jurvetson @ flickr) Neutral Silicon Vacancies: Production and Testing The team hypothesized that SiVs without a charge or neutral SiVs (SiV0s) could solve these problems. Therefore, they commissioned the company, Element Six, to synthesize diamonds, which the investigators then treated with heat to implant silicon ions into the material. This process required repeated tweaking and tuning before the successful production of diamonds with SiV0s. These vacancies exhibited a coherence time of nearly 1 second, and spin-lattice relaxation within approximately 1 minute. These favorable properties were accompanied by desirable optical linewidths and excellent light-emission specifications. The team also reported that these attributes allowed for successful quantum entanglement; in other words, super-secure data transmission between two quantum sources. The group was confident that their SIV0s would be capable of repeating qubits (which are often encoded into photons) across a network. The researchers have also proposed further studies, in which a system will be designed to test this concept out. The project would likely include a simple quantum computer or computer with SiV0s as the data interface. On the other hand, the successful production of SiV0s, which may be able to transfer quantum data, is a considerable achievement. 
This study may be the first step towards the establishment of quantum networks that depend on diamonds with silicon vacancies. In addition, the solid medium may have advantages over others (e.g., in conventional fiber-optic cables) in terms of quantum data integrity. Furthermore, this SiV0s may also be useful for quantum data storage. On the other hand, SiV0s do not have the lifespan associated with NVs. This may be a problem for the future of quantum computing. Vacancies cause the light within diamonds to act slightly differently compared to that of flawless stones. (Source: de Leon lab, Princeton) Vacancies within diamonds have been vilified for centuries, as merchants and jewelers were aware of their flaws due to the colored light they caused gems to reflect. However, scientists have found much to value in these so-called defects. They can be exploited to produce cutting-edge optical and nanoscopic diagnostic tools. In addition, as demonstrated in a recent issue of the journal Science, certain vacancies can confer the data-repeating abilities necessary for true, networked quantum computing. Therefore, this study may help unlock the potential of quantum processing, which is the next step towards greater complexity and flexibility in computing. On the other hand, scientists will have to find a way to make the silicon vacancies in question last as long as their nitrogen counterparts previously did. Top Image: A perfect diamond. (Source: http://pngimg.com) Implanting diamonds with flaws to provide key technology for quantum communications, 2018, Princeton News, https://www.princeton.edu/news/2018/07/05/implanting-diamonds-flaws-provide-key-technology-quantum-communications , (accessed 10 Jul. 18) B. C. Rose, et al. (2018) Observation of an environmentally insensitive solid-state spin defect in diamond. Science. 361:(6397). pp.60-63. Graphene is the New Silicon? – A Closer Look at the Most Likely Next-Generation Superconductor, 2017, Evolving Science, https://www.evolving-science.com/information-communication-computer-science-technology/graphene-new-silicon-closer-look-most-likely-next-generation-superconductor-00415 , (accessed on 10 Jul. 18)
<urn:uuid:baf41652-c5dc-47d3-9ce7-16858340dc02>
CC-MAIN-2021-04
https://www.evolving-science.com/matter-energy/quantum-networks-synthetic-diamonds-00720
s3://commoncrawl/crawl-data/CC-MAIN-2021-04/segments/1610703519883.54/warc/CC-MAIN-20210120023125-20210120053125-00486.warc.gz
en
0.935892
1,216
3.609375
4
Binary refers to any system that uses two alternative states, components, conditions or conclusions. The binary, or base 2, numbering system uses combinations of just two unique numbers, i.e., zero and one, to represent all values, in contrast with the decimal system (base 10), which uses combinations of ten unique numbers, i.e., zero through nine. Virtually all electronic computers are designed to operate internally with all information encoded in binary numbers. This is because it is relatively simple to construct electronic circuits that generate two distinct voltage levels (i.e., off and on or low and high) to represent zero and one, since transistors and capacitors, which are the fundamental components of processors (the logic units of computers) and memory, generally have only two distinct states: off and on. The values of bits are stored in various ways, depending on the medium. For example, the value of each bit is stored as an electrical charge in a single capacitor within a RAM (random access memory) chip. It is stored as the magnetization of a microscopic area of magnetic material on a platter in a hard disk drive (HDD) or on a floppy disk. It is stored along the spiral track on an optical disk as a change from a pit to the surface or from the surface to a pit (representing a one) and as no change (representing a zero). Computers are almost always designed to store data and execute instructions in larger and more meaningful units called bytes, although they usually also provide ways to test and manipulate single bits. Bytes are abbreviated with an upper case B, and bits are abbreviated with a lower case b. The number of bits in a byte varied according to the manufacturer and model of computer in the early days of computing, but today virtually all computers use bytes that consist of eight bits. Whereas a bit can have only one of two values, an eight-bit byte can have any of 256 possible values, because there are 256 possible permutations (i.e., combinations of zero and one) for eight consecutive bits (i.e., 2^8). Thus, an eight-bit byte can represent any unsigned integer from zero through 255 or any signed integer from -128 to 127. It can also represent any character (i.e., letter, number, punctuation mark or symbol) in a seven-bit or eight-bit character encoding system (such as ASCII, the default character encoding used on most computers). The number of bits is often used to classify generations of computers and their components, particularly CPUs (central processing units) and busses, and to provide an indication of their capabilities. However, such terminology can be confusing or misleading when used in an imprecise manner, which it frequently is. For example, classifying a computer as a 32-bit machine might mean that its data registers are 32 bits wide, that it uses 32 bits to identify each address in memory, or that its address buses or data buses are of that size. A register is a very small amount of very fast memory that is built into the CPU in order to speed up its operations by providing quick access to commonly used values. Whereas using more bits for registers makes computers faster, using more bits for addresses enables them to support larger programs. A bus is a set of wires that connects components within a computer, such as the CPU and the memory. A 32-bit bus transmits 32 bits in parallel (i.e., simultaneously rather than sequentially).
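The ranges quoted above follow directly from how the same eight bits are interpreted. The short sketch below uses only the Python standard library and shows one byte read as an unsigned integer, as a two's-complement signed integer, and as an ASCII character code.

```python
byte = 0b10110101                       # one 8-bit pattern: 181 in decimal

unsigned = byte                                         # 0..255 interpretation
signed = byte - 256 if byte >= 128 else byte            # two's complement: -128..127
print(f"bits={byte:08b} unsigned={unsigned} signed={signed}")

# The same signed conversion via the standard library:
assert int.from_bytes(bytes([byte]), "big", signed=True) == signed

# Eight-bit patterns below 128 double as ASCII code points.
for b in (65, 97, 48):
    print(f"{b:3d} -> {chr(b)!r}")      # 'A', 'a', '0'
```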
Although CPUs that treat data in 32-bit chunks (i.e., processors with 32-bit registers and 32-bit memory addresses) still constitute the personal computer mainstream, 64-bit processors are common in high-performance servers and are now being used in an increasing number of personal computers as well. The rate of data transfer in computer networks and telecommunications systems is referred to as the bit rate or bandwidth, and it is usually measured in terms of some multiple of bits per second, abbreviated bps, such as kilobits, megabits or gigabits (i.e., billions of bits) per second. A bitmap is a method of storing graphics (i.e., images) in which each pixel (i.e., dot that is used to form an image on a display screen) is stored as one or several bits. Graphics are also often described in terms of bit depth, which is the number of bits used to represent each pixel. A single-bit pixel is monochrome (i.e., either black or white), a two-bit pixel can represent any of four colors (or black and white and two shades of gray), an eight-bit pixel can represent 256 colors, and 24-bit and 32-bit pixels support highly realistic color, which is referred to as true color. The word bit was invented in the latter half of the 1940s by John W. Tukey (1915-2000), an eminent statistician, while working at Bell Labs (the research arm of AT&T, the former U.S. telecommunications monopoly). He coined it as a contraction of the term binary digit and as a handier alternative to bigit or binit. Tukey also coined the word software. The term bit was first used in an influential publication by Claude E. Shannon (1916-2001), also while at Bell Labs, in his seminal 1948 paper A Mathematical Theory of Communication. Shannon, widely regarded as the father of information theory, developed a theory that for the first time treated communication as a rigorously stated mathematical problem and provided communications engineers with a technique for determining the capacities of communications channels in terms of bits. Although the bit has been the smallest unit of storage used in computing so far, much research is being conducted on qubits, the basic unit of information in quantum computing (which is based on phenomena that occur at the atomic and subatomic levels). Qubits hold an exponentially greater amount of information than conventional bits. Created March 4, 2005. Updated April 5, 2006.
<urn:uuid:cf75d422-e50f-41de-b1ad-4b5f7c2f33cb>
CC-MAIN-2014-35
http://www.linfo.org/bit.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1409535919886.18/warc/CC-MAIN-20140909055318-00483-ip-10-180-136-8.ec2.internal.warc.gz
en
0.950656
1,236
4.03125
4
MIT researchers have created a new imaging system that can acquire visual data at a rate of one trillion exposures per second. That’s fast enough to produce a slow-motion video of a burst of light traveling the length of a one-liter bottle, bouncing off the cap and reflecting back to the bottle’s bottom. Media Lab postdoc Andreas Velten, one of the system’s developers, calls it the “ultimate” in slow motion: “There’s nothing in the universe that looks fast to this camera,” he says. Picosecond Camera for Time-of-Flight Imaging Slow art with a trillion frames per second camera How will the world look with a one trillion frame per second camera? Although such a camera does not exist today, we converted high end research equipment to produce conventional movies at 0.5 trillion (5· 10^11) frames per second, with light moving barely 0.6 mm in each frame. Our camera has the game changing ability to capture objects moving at the speed of light. Inspired by the classic high speed photography art of Harold Edgerton [Kayafas and Edgerton 1987] we use this camera to capture movies of several scenes. The system relies on a recent technology called a streak camera, deployed in a totally unexpected way. The aperture of the streak camera is a narrow slit. Particles of light — photons — enter the camera through the slit and pass through an electric field that deflects them in a direction perpendicular to the slit. Because the electric field is changing very rapidly, it deflects late-arriving photons more than it does early-arriving ones. The image produced by the camera is thus two-dimensional, but only one of the dimensions — the one corresponding to the direction of the slit — is spatial. The other dimension, corresponding to the degree of deflection, is time. The image thus represents the time of arrival of photons passing through a one-dimensional slice of space. The camera was intended for use in experiments where light passes through or is emitted by a chemical sample. Since chemists are chiefly interested in the wavelengths of light that a sample absorbs, or in how the intensity of the emitted light changes over time, the fact that the camera registers only one spatial dimension is irrelevant. But it’s a serious drawback in a video camera. To produce their super-slow-mo videos, Velten, Media Lab Associate Professor Ramesh Raskar and Moungi Bawendi, the Lester Wolfe Professor of Chemistry, must perform the same experiment — such as passing a light pulse through a bottle — over and over, continually repositioning the streak camera to gradually build up a two-dimensional image. Synchronizing the camera and the laser that generates the pulse, so that the timing of every exposure is the same, requires a battery of sophisticated optical equipment and exquisite mechanical control. It takes only a nanosecond — a billionth of a second — for light to scatter through a bottle, but it takes about an hour to collect all the data necessary for the final video. Doing the math After an hour, the researchers accumulate hundreds of thousands of data sets, each of which plots the one-dimensional positions of photons against their times of arrival. Raskar, Velten and other members of Raskar’s Camera Culture group at the Media Lab developed algorithms that can stitch that raw data into a set of sequential two-dimensional images. 
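The stitching step has a simple core: each exposure contributes one spatial row as a function of time, and repositioning the slit fills in the other spatial dimension. The sketch below fakes such a dataset with random photon counts purely to show the rearrangement from per-slit (y, x, t) records into a time-ordered stack of 2-D frames; the shapes are invented and the real pipeline naturally involves careful synchronization, calibration and denoising.

```python
import numpy as np

n_slit_positions = 120      # vertical positions the streak camera is stepped through
n_x = 200                   # pixels along the slit (one spatial dimension)
n_t = 480                   # time bins resolved by the streak deflection

# Stand-in for the measured data: raw[y, x, t] = photon counts at slit position y.
rng = np.random.default_rng(0)
raw = rng.poisson(lam=3.0, size=(n_slit_positions, n_x, n_t))

# Stitch: frame k is the image formed by time bin k across all slit positions.
frames = np.transpose(raw, (2, 0, 1))      # shape (n_t, n_y, n_x)

print(frames.shape)                        # (480, 120, 200): a 480-frame movie
print(frames[0].shape)                     # each frame is a 120 x 200 image
```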
The streak camera and the laser that generates the light pulses — both cutting-edge devices with a cumulative price tag of $250,000 — were provided by Bawendi, a pioneer in research on quantum dots: tiny, light-emitting clusters of semiconductor particles that have potential applications in quantum computing, video-display technology, biological imaging, solar cells and a host of other areas. The trillion-frame-per-second imaging system, which the researchers have presented both at the Optical Society's Computational Optical Sensing and Imaging conference and at Siggraph, is a spinoff of another Camera Culture project, a camera that can see around corners. That camera works by bouncing light off a reflective surface — say, the wall opposite a doorway — and measuring the time it takes different photons to return. But while both systems use ultrashort bursts of laser light and streak cameras, the arrangement of their other optical components and their reconstruction algorithms are tailored to their disparate tasks. Because the ultrafast-imaging system requires multiple passes to produce its videos, it can’t record events that aren’t exactly repeatable. Any practical applications will probably involve cases where the way in which light scatters — or bounces around as it strikes different surfaces — is itself a source of useful information. Those cases may, however, include analyses of the physical structure of both manufactured materials and biological tissues — “like ultrasound with light,” as Raskar puts it. As a longtime camera researcher, Raskar also sees a potential application in the development of better camera flashes. “An ultimate dream is, how do you create studio-like lighting from a compact flash? How can I take a portable camera that has a tiny flash and create the illusion that I have all these umbrellas, and sport lights, and so on?” asks Raskar, the NEC Career Development Associate Professor of Media Arts and Sciences. “With our ultrafast imaging, we can actually analyze how the photons are traveling through the world. And then we can recreate a new photo by creating the illusion that the photons started somewhere else.” “It’s very interesting work. I am very impressed,” says Nils Abramson, a professor of applied holography at Sweden’s Royal Institute of Technology. In the late 1970s, Abramson pioneered a technique called light-in-flight holography, which ultimately proved able to capture images of light waves at a rate of 100 billion frames per second. But as Abramson points out, his technique requires so-called coherent light, meaning that the troughs and crests of the light waves that produce the image have to line up with each other. “If you happen to destroy the coherence when the light is passing through different objects, then it doesn’t work,” Abramson says. “So I think it’s much better if you can use ordinary light, which Ramesh does.” Research project website for the trillion frame per second camera If you liked this article, please give it a quick review on ycombinator or StumbleUpon. Thanks
<urn:uuid:6a04d71d-0366-4e38-a2cc-33baf2de56a4>
CC-MAIN-2014-35
http://nextbigfuture.com/2011/12/trillion-frame-per-second-video.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1409535921869.7/warc/CC-MAIN-20140901014521-00015-ip-10-180-136-8.ec2.internal.warc.gz
en
0.923886
1,373
3.546875
4
Angle speeds plastic transistor
Technology Research News
Plastic computer chips have recently received a lot of attention because they promise to imbue everyday objects with inexpensive electronic intelligence and enable flexible displays and electronic paper. Though they are flexible and potentially very inexpensive, organic electronic devices perform relatively poorly. This is because organic materials have low charge carrier mobility, which is a measure of how readily electricity -- or negatively-charged electrons and positively-charged holes -- moves through the material. Researchers from Lucent Technologies' Bell Laboratories, Rutgers University and the University of Illinois have found that the orientation of crystalline organic semiconductors plays a big role in organic transistor performance. The researchers have developed a simple lamination manufacturing process for making transistors from the fragile organic material, and the resulting transistors have set a record for carrier mobility in organic transistors. The researchers' method could lead to mass production techniques for organic transistors and light-emitting diodes. The researchers' field-effect transistor is formed from organic rubrene crystal and titanium and gold electrodes and has a carrier mobility of 15.4 square centimeters per volt second, compared to typical organic semiconductor carrier mobilities of less than one square centimeter per volt second, according to John Rogers, a professor of materials science and engineering at the University of Illinois. The silicon transistors commonly used in today's computer chips have carrier mobilities of 1,500 square centimeters per volt second, and other inorganic crystalline semiconductors can have carrier mobilities an order of magnitude higher than silicon. The orientation of the molecules within the organic crystal and the spacing between the molecules contribute to the prototype's relatively high carrier mobility, said Rogers. Crystal molecules in the prototype transistor are spaced 1.44 nanometers apart in one direction and 0.72 nanometers in the perpendicular direction. Carrier mobility dropped to 4.4 square centimeters per volt second when the wide spacing of the crystal was aligned with the electrodes. A nanometer is one millionth of a millimeter. The rubrene molecule has groups of atoms attached to its sides, and electrons flow along these side groups and along the backbone of the molecule. In the high-mobility orientation, the molecules' side groups are aligned, facilitating electron flow from molecule to molecule. "The orientation of the molecules relative to the electrodes of the transistors has a profound impact on the way [the] devices behave," said Rogers. To test the relationship between orientation and performance in the organic crystal, the researchers developed a method of making field effect transistors that allowed them to repeatedly place, remove, rotate and replace the relatively fragile crystal on the transistor's electrodes. "We build all components of the transistor -- source/drain electrodes, gate dielectric, and gate electrode -- out of soft, conformable materials built on a soft, elastomeric substrate," said Rogers. "We then, at room temperature and without applied pressure, gently place the organic crystal, which is grown in a separate process...
on the surface of this transistor." To make the transistors, the researchers placed a titanium-gold gate electrode on a silicone rubber surface, covered it with a thin film of silicone rubber and placed titanium-gold source and drain electrodes on top. They then simply placed the organic crystal over the electrodes and gently pressed one edge of the crystal. This caused the crystal to adhere to the silicone and metal due to the van der Waals force, which is the electrostatic attraction between atoms and molecules. "The soft contact forms very high-performance transistors in a way that avoids all of the hazards that conventional semiconductor processing poses to the organics," said Rogers. The lamination method could be used in practical applications in three to five years, said Rogers. Rogers' research colleagues were Bell Laboratories researchers Vikram Sundar, now at IBM, Jana Zaumseil, now at the University of Cambridge, Robert Willett, and Takao Someya, now at the University of Tokyo; Vitaly Podzorov and Michael Gershenson of Rutgers University; and Etienne Menard of the University of Illinois. They published the research in the March 12, 2004 issue of Science. The research was funded by the National Science Foundation (NSF) and the U.S. Department of Energy. Timeline: 3-5 years. TRN Categories: Integrated Circuits; Materials Science. Story Type: News. Related Elements: Technical paper, "Elastomeric Transistor Stamps: Reversible Probing of Charge Transport in Organic Crystals," Science, March 12, 2004. April 7/14, 2004
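Carrier mobility feeds directly into how much current a field-effect transistor can deliver. As a rough illustration, the textbook saturation-regime expression I_D = (W / 2L) · μ · C_i · (V_G − V_T)² can be evaluated with the mobilities quoted in the article; the device geometry, gate capacitance and voltages below are invented placeholder values, not figures from the paper.

```python
# Saturation-regime FET drain current: I_D = (W / (2 L)) * mu * C_i * (V_G - V_T)^2
def drain_current(mu_cm2_per_Vs, W_um=1000.0, L_um=50.0, C_i_nF_per_cm2=5.0, overdrive_V=20.0):
    mu = mu_cm2_per_Vs * 1e-4            # cm^2/(V s) -> m^2/(V s)
    W, L = W_um * 1e-6, L_um * 1e-6      # micrometres -> metres
    C_i = C_i_nF_per_cm2 * 1e-9 / 1e-4   # nF/cm^2 -> F/m^2
    return (W / (2 * L)) * mu * C_i * overdrive_V ** 2

for label, mu in [("rubrene crystal (15.4)", 15.4), ("typical organic (<1)", 1.0)]:
    print(f"{label:25s} I_D = {drain_current(mu) * 1e6:.0f} microamps")
```

With everything else held fixed, the drain current scales linearly with mobility, which is why the fifteen-fold mobility improvement reported here matters for practical circuits.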
<urn:uuid:bdf82f50-1c29-4ecf-8c4e-f7e5820afabd>
CC-MAIN-2014-35
http://www.trnmag.com/Stories/2004/040704/Angle_speeds_plastic_transistor_040704.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500824209.82/warc/CC-MAIN-20140820021344-00304-ip-10-180-136-8.ec2.internal.warc.gz
en
0.851456
1,081
3.5625
4
In atomic physics, hyperfine coupling is the weak magnetic interaction between electrons and nuclei. Hyperfine coupling causes the hyperfine splitting of atomic or molecular energy levels. The totality of energy levels spawned by hyperfine splitting is called the hyperfine structure of the atom's or molecule's spectrum. The following terminology has evolved to describe atomic and/or molecular spectra:
- The gross structure is due to the energy difference of electronic orbitals with different principal quantum number n.
- The fine structure occurs only for states with nonzero orbital angular momentum; it is due to the spin-orbit coupling (the energy difference between the electron spin being parallel or antiparallel to the electron's orbital moment).
- The hyperfine structure is due to an unpaired electron interacting with a nucleus having nuclear spin quantum number I ≠ 0. The electron and nucleus (nuclei) are on the same atom or within the same molecule.
- The superhyperfine structure is due to an unpaired electron interacting with a nucleus having I ≠ 0. The electron and nucleus (nuclei) are on different atoms or different molecules.
- The spin-spin structure is due to interactions among nuclei having I ≠ 0. This phenomenon is especially important in NMR spectra.
In first order, hyperfine coupling is a magnetic dipole-dipole interaction, arising from the interaction of the nuclear magnetic moment with the magnetic field of the electron. According to classical thinking, the electron moving around the nucleus has a magnetic dipole moment, because it is charged. The interaction of this magnetic dipole moment with the magnetic moment of the nucleus (due to its spin) leads to hyperfine splitting. However, due to the electron's spin, there is also hyperfine splitting for s-shell electrons, which have zero orbital angular momentum. In this case, the magnetic dipole interaction is even stronger, as the electron probability density does not vanish inside the nucleus (the Fermi contact interaction). The amount of correction to the Bohr energy levels due to hyperfine splitting of the hydrogen atom is of the order (m_e/m_p) α^4 m_e c^2, where m_e is the mass of an electron, m_p is the mass of a proton, α is the fine structure constant, and c is the speed of light. For atoms other than hydrogen, the nuclear spin I and the total electron angular momentum J get coupled, giving rise to the total angular momentum F = J + I. The hyperfine splitting is then ΔE_hf = −μ_I · B_el, where μ_I is the magnetic dipole moment of the nucleus and B_el is the atomic magnetic field produced by the electrons at the nucleus. This interaction obeys the Lande interval rule: the energy level is split into 2I+1 or 2J+1 sublevels (whichever is smaller), where J denotes the total electron angular momentum and I denotes the nuclear spin. Usually, the splitting is of the order of GHz; the hyperfine splitting is an orders-of-magnitude smaller perturbation than the fine structure. In a more advanced treatment, one also has to take the nuclear magnetic quadrupole moment into account. This is sometimes referred to as the "hyperfine structure anomaly". The optical hyperfine structure was already observed in 1881 by Albert Abraham Michelson. It could, however, only be explained in terms of quantum mechanics in the 1920s. Wolfgang Pauli proposed the existence of a small nuclear magnetic moment in 1924. In 1935, H. Schüler and T. Schmidt proposed the existence of a nuclear quadrupole moment in order to explain anomalies in the hyperfine structure. Hyperfine interactions can be measured, among other ways, in atomic and molecular spectra and in electron paramagnetic resonance spectra of free radicals and transition-metal ions.
As the hyperfine splitting is very small, the transition frequencies usually are not optical, but in the range of radio or microwave frequencies. Hyperfine structure gives the 21 cm line observed in H I regions of the interstellar medium. Carl Sagan and Frank Drake considered the hyperfine transition of hydrogen to be a sufficiently universal phenomenon so as to be used as a base unit of time and length on the Pioneer plaque and later the Voyager Golden Record. In radio astronomy, heterodyne receivers are widely used in detection of the electromagnetic signals from celestial objects. The separations among various components of a hyperfine structure are usually small enough to fit into the receiver's IF band. Because optical depth varies with frequency, strength ratios among the hyperfine components differ from those of their intrinsic intensities. From this we can derive the object's physical parameters. The atomic vapor laser isotope separation (AVLIS) process uses the hyperfine splitting between optical transitions in uranium-235 and uranium-238 to selectively photoionize only the uranium-235 atoms and then separate the ionized particles from the non-ionized ones. Precisely tuned dye lasers are used as the sources of the necessary exact-wavelength radiation.
Use in defining the SI second and meter
The hyperfine structure transition can be used to make a microwave notch filter with very high stability, repeatability and Q factor, which can thus be used as a basis for very precise atomic clocks. Typically, the hyperfine structure transition frequency of a particular isotope of caesium atoms is used as a basis for these clocks. Due to the accuracy of hyperfine structure transition-based atomic clocks, they are now used as the basis for the definition of the second. One second is now defined to be exactly 9,192,631,770 cycles of the hyperfine structure transition frequency of caesium-133 atoms. Since 1983, the meter has been defined by declaring the speed of light in a vacuum to be exactly 299,792,458 metres per second. Thus: The metre is the length of the path travelled by light in vacuum during a time interval of 1/299 792 458 of a second.
Precision tests of quantum electrodynamics
The hyperfine splittings in hydrogen and in muonium have been used to measure the value of the fine structure constant α. Comparison with measurements of α in other physical systems provides a stringent test of QED.
Qubit in ion-trap quantum computing
The hyperfine states of a trapped ion are commonly used for storing qubits in ion-trap quantum computing. They have the advantage of having very long lifetimes, experimentally exceeding ~10 min (compared to ~1 s for metastable electronic levels). The frequency associated with the states' energy separation is in the microwave region, making it possible to drive hyperfine transitions using microwave radiation. However, at present no emitter is available that can be focused to address a particular ion from a sequence. Instead, a pair of laser pulses can be used to drive the transition, by having their frequency difference (detuning) equal to the required transition's frequency. This is essentially a stimulated Raman transition.
Further reading:
- G. Herzberg, Atomic Spectra and Atomic Structure. Dover, New York, 1944. See especially chapter 5.
- M. Symons, Chemical and Biochemical Aspects of Electron-Spin Resonance Spectroscopy. Wiley, New York, 1978.
- J. A. Weil, J. R. Bolton, and J. E. Wertz, Electron Paramagnetic Resonance: Elementary Theory and Practical Applications. Wiley-Interscience, New York, 2001.
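The numbers in this article hang together: the leading-order formula for the hydrogen ground-state splitting, ΔE ≈ (4/3) g_p (m_e/m_p) α^4 m_e c^2, reproduces the 21 cm line closely. The sketch below evaluates it with standard constants; this is the lowest-order (Fermi contact) estimate only, with reduced-mass and QED corrections neglected.

```python
# Leading-order hydrogen ground-state hyperfine splitting (Fermi contact term).
alpha = 7.2973525693e-3        # fine structure constant
me_c2_eV = 510_998.95          # electron rest energy, eV
me_over_mp = 1 / 1836.15267343 # electron/proton mass ratio
g_p = 5.5856946893             # proton g-factor
h_eVs = 4.135667696e-15        # Planck constant, eV*s
c = 2.99792458e8               # speed of light, m/s

delta_E = (4 / 3) * g_p * me_over_mp * alpha**4 * me_c2_eV    # eV
freq = delta_E / h_eVs                                        # Hz
wavelength_cm = c / freq * 100

print(f"splitting  ~ {delta_E * 1e6:.2f} micro-eV")
print(f"frequency  ~ {freq / 1e6:.0f} MHz   (measured: 1420.4 MHz)")
print(f"wavelength ~ {wavelength_cm:.1f} cm")
```

The ~1420 MHz figure is the same transition frequency that underlies the 21 cm astronomy discussed above, and the analogous (much larger) splitting in caesium-133 is the 9,192,631,770 Hz that defines the SI second.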
<urn:uuid:80f403ac-1315-4162-96e9-60c07bb9c6b0>
CC-MAIN-2014-35
http://www.reference.com/browse/hyperfine+structure
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500835872.63/warc/CC-MAIN-20140820021355-00286-ip-10-180-136-8.ec2.internal.warc.gz
en
0.895858
1,478
3.796875
4
(Phys.org) — While quantum states are typically referred to as particles or waves, this is not actually the case. Rather, quantum states have complementary discrete particlelike and continuous wavelike properties that emerge based on the experimental or observational context. In other words, when used to describe quantum states the terms particle and wave are convenient but inaccurate metaphors. This is an important consideration in quantum computing, where photons are used as units of quantum information known as quantum bits, or qubits, which due to quantum superposition (and therefore unlike classical bits) can simultaneously exist in two states.

That said, current attempts to devise quantum computers that process photonic qubits universally using particle detectors to count photons and optical circuits to capture quantum wave evolution have been stymied by the fact that ancilla states – fixed qubit states used in reversible quantum computing as input to a gate to give that gate a more specific logic function – consist of many highly entangled photons, thereby exceeding experimental capabilities. (Entanglement is a uniquely quantum state in which two or more interacting particles are said to be hypercorrelated – meaning that the state of each individual particle cannot be described independently, and that a change in a property of one particle is instantly reflected in its entangled partner regardless of the distance separating them.)

Recently, however, scientists at The University of Tokyo demonstrated for the first time a two-way conversion between a particlelike single-photon state and a wavelike superposition of coherent states by applying quantum squeezing/unsqueezing as a quantum gate, deriving Gaussian (coherent) operations that are applicable to nonclassical, non-Gaussian quantum states and therefore expanding the hybrid quantum-information processing optical toolbox. (In general, a squeezed coherent state is a quantum state in which the uncertainty principle is saturated. Achieved using a number of methods [1], squeezed light is a state in which quantum noise is reduced. Specifically, in a squeezed state the electric field noise paradoxically falls below that of the vacuum state – a phenomenon that has no classical counterpart.) Moreover, the researchers say that their so-called squeezing gate will lead to new applications while forming the basis of a new class of optical quantum processors capable of integrating particlelike and wavelike quantum states.

Prof. Akira Furusawa discussed the paper that he and his co-authors published in Physical Review Letters with Phys.org – including the main challenges in successfully applying a quantum optical squeezing operation upon non-Gaussian quantum states, thereby demonstrating a two-way conversion between a particlelike single-photon state and a wavelike superposition of coherent states. "Previous approaches using direct squeezing operations for nonclassical non-Gaussian states were very difficult because such states are very fragile to losses – and direct squeezing operations inevitably have losses," Furusawa tells Phys.org. "In our approach, the squeezing operation is not direct. Instead, we first prepare a squeezed vacuum by using a conventional optical parametric oscillator and then teleport the squeezing operation to fragile nonclassical non-Gaussian states through linear optics, which have almost no losses."
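To make the squeezing operation concrete, here is a minimal single-mode sketch (an added textbook-style illustration, not a reconstruction of the Tokyo group's optical setup or code): the squeezing operator is applied to a one-photon Fock state in a truncated Fock space, and one quadrature variance shrinks while the conjugate one grows.

```python
# Squeeze a single-photon Fock state in a truncated Fock space.
import numpy as np
from scipy.linalg import expm

N = 40                                      # Fock-space truncation
a = np.diag(np.sqrt(np.arange(1, N)), 1)    # annihilation operator
adag = a.conj().T

def squeeze(r):
    """Squeezing operator S(r) = exp[(r/2)(a^2 - adag^2)] for real r."""
    return expm(0.5 * r * (a @ a - adag @ adag))

x = (a + adag) / np.sqrt(2)                 # quadratures with [x, p] = i
p = (a - adag) / (1j * np.sqrt(2))

def variance(op, psi):
    mean = np.vdot(psi, op @ psi).real
    return np.vdot(psi, op @ (op @ psi)).real - mean**2

one_photon = np.zeros(N, dtype=complex)
one_photon[1] = 1.0                         # |1>

for r in (0.0, 0.5, 1.0):
    psi = squeeze(r) @ one_photon
    print(f"r = {r:3.1f}:  Var(x) = {variance(x, psi):.3f},  Var(p) = {variance(p, psi):.3f}")
```

As the squeezing parameter grows, Var(x) drops below its unsqueezed value while Var(p) grows by the same factor, which is exactly the kind of noise redistribution the article describes.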
In quantum teleportation [2], qubits (specifying, for example, a photon's precise state) are transmitted between quantum-entangled locations via classical communication systems. "In this case, the essential resource is entanglement between the ancillary squeezed vacuum and nonclassical non-Gaussian states, which are created by a beam splitter with no losses," Furusawa notes. "Our successful teleportation of the squeezing operation to a single-photon state and Schrödinger's-cat" – that is, superposition – "state is the first example of deterministic quantum gate teleportation."

Another first: the researchers used universal, reversible, low-loss broadband squeezing to access a complete set of deterministic Gaussian operations applicable to nonclassical, non-Gaussian states. "A complete set of deterministic Gaussian operations consists of displacement, rotation, and squeezing in phase space," Furusawa explains. "Displacement can be realized by using an optical modulator and a beam splitter, and rotation by controlling optical path length. Therefore, both operations are very easy to apply – even to nonclassical non-Gaussian states. The last piece of the complete set is squeezing, where we succeeded – also for the first time."

In short, the scientists' key result – demonstrating the very powerful capability of deterministic quantum gate teleportation – allows non-Gaussian operations that can, in principle, be used to build the elusive universal quantum computer. "We want to hybridize the discrete and continuous quantum protocols to build an efficient and robust quantum computer," Furusawa confirms. "The advantage of using qubit protocols is the robustness coming from the digital processing-like finite dimensionality, while the advantage of continuous-variable protocols is efficiency, because they can allow us to make deterministic operations." (Furusawa points out that it remains an open question if this hybridization has implications for ongoing efforts to integrate quantum mechanics and general relativity, which are described using discrete and continuous mathematics, respectively.)

The paper also describes the notable finding that allows the entire Fock space to be used when processing single photons, thereby possibly helping to construct quantum gates and error correction codes for logical qubits. (The Fock space is the mathematical construction used to describe quantum states with a variable or unspecified number of identical particles, built from a single-particle Hilbert space, which is itself a generalization of Euclidean space.) "Specifically," says Furusawa, "we're now thinking about constructing quantum gates and error correction codes with the hybrid protocol." Furusawa adds that the deterministic Gaussian operations made possible by their broadband squeezer will directly lead to applications in this area. "Firstly," he illustrates, "we can construct a quantum non-demolition (QND) gate – in which a measured observable's uncertainty does not increase as the quantum system evolves – that corresponds to a qubit controlled-NOT (CNOT) gate." Quantum CNOT gates can be used to simulate any quantum circuit to an arbitrary degree of accuracy, as well as to create and dismantle entangled, or EPR (after the 1935 paper [3] by Albert Einstein, Boris Podolsky and Nathan Rosen), states. "Secondly, since the QND gate is a universal entangling gate, it allows more complicated quantum gate teleportation."
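The "complete set" Furusawa describes (displacement, rotation, and squeezing) is often summarized in a phase-space picture, where each Gaussian operation acts on a quadrature mean vector and covariance matrix; the sketch below uses that standard picture with conventions of my own choosing and is not code from the paper.

```python
# Single-mode Gaussian operations acting on (mean, covariance), hbar = 1.
import numpy as np

def rotation(theta):
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

def squeezing(r):
    return np.diag([np.exp(-r), np.exp(r)])

mean = np.zeros(2)
cov = 0.5 * np.eye(2)                    # vacuum: det(cov) = 1/4, uncertainty saturated

mean = mean + np.array([1.0, 0.0])       # displacement only shifts the mean
S = rotation(np.pi / 8) @ squeezing(0.8) # squeeze, then rotate
mean = S @ mean
cov = S @ cov @ S.T

print("mean:", mean)
print("covariance:\n", cov)
print("det(cov) =", np.linalg.det(cov))  # still 1/4: these operations preserve the phase-space area
```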
In addition, Furusawa tells Phys.org, their next target is a particlelike/wavelike hybrid CNOT gate based on non-Gaussian quantum gate teleportation. "We're also thinking about applying this technology to optical communications – especially a quantum mechanically optimal receiver."
More information: Exploring a New Regime for Processing Optical Qubits: Squeezing and Unsqueezing Single Photons, Physical Review Letters 113, 013601 (published 2 July 2014), doi:10.1103/PhysRevLett.113.013601
[1] Squeezed light, arXiv:1401.4118v1 [quant-ph]
[2] Quantum Teleportation and Entanglement, Akira Furusawa and Peter van Loock, Wiley-VCH (2011), ISBN-13: 978-3527409303 (hardcover), ASIN: B00BP7S3X8 (Kindle), ISBN: 9783527635306 (Google eBook)
[3] Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?, Physical Review 47, 777 (15 May 1935), doi:10.1103/PhysRev.47.777
<urn:uuid:b2bb4fce-dd5b-47d2-bc00-8f7eb3a05ab0>
CC-MAIN-2014-35
http://phys.org/news/2014-07-particle-optical-qubit-technique-photons.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500824970.20/warc/CC-MAIN-20140820021344-00419-ip-10-180-136-8.ec2.internal.warc.gz
en
0.903843
1,596
3.640625
4
Rydberg atoms are atoms in which one or more of the atom's electrons have been excited into very high energy states. Because the Rydberg electron is so far from the core of the atom, the atom develops exaggerated properties, such as huge polarizabilities that scale like n^7, where n is the principal quantum number. These exaggerated properties lead to strong, tunable interactions among the atoms, which have applications in many different fields of physics. One of the most important consequences of the strong interactions between Rydberg atoms is the Rydberg excitation blockade, which results from the interactions shifting the energy levels of the atoms so that the levels deviate from an equidistant ladder. If the shift of the second excited state is great enough that the excitation laser is out of resonance with that state, then all excitation above the first excited state is blockaded. Some of the applications of the Rydberg excitation blockade include quantum computation, quantum cryptography, improved spectroscopic resolution, and atomic clocks. The first proposal to use the blockade for quantum information was in 2000, when Jaksch et al. suggested a method of generating a fast phase gate using Rydberg atoms (Phys. Rev. Lett. 85, 2208 (2000)). As we move toward the goal of quantum computing with Rydberg atoms, we have conducted many interesting studies. We highlight work done concerning the Autler-Townes effect with 85Rb. By taking advantage of the long lifetimes of Rydberg atoms (tens of microseconds), and hence the small spectroscopic linewidths of Rydberg states, we are able to achieve Autler-Townes spectra with high resolution. These measurements provide a foundation for all later work, as Autler-Townes spectroscopy is a tool for measuring Rabi frequencies with high accuracy. We have also conducted a spectroscopic measurement of the energy shifts of the second excited state of the Rydberg excitation ladder in different interaction regimes. By applying two sets of excitation pulses with variable frequency (a set because the excitation to Rydberg states is a two-photon excitation), we have measured the lineshape of the 1R - 2R transition. This study is the first spectroscopic proof of the functionality of the Rydberg excitation blockade. One way of measuring the effectiveness of the Rydberg excitation blockade is to use counting statistics. We have used this method for a range of nD5/2 rubidium Rydberg states. Counting statistics measurements are particularly useful for measuring blockade effectiveness in small atomic samples and for a variety of different experimental parameters such as excitation Rabi frequencies, detuning, and quantum state. All atoms have repulsive or attractive forces between them due to temporary dipole moments, which arise when the electrons of an atom leave the positively charged nucleus unshielded. Typically the positively charged nucleus polarizes (induces a dipole in) nearby atoms, causing a temporary dipole-dipole interaction. These temporary, off-resonant dipole-dipole interactions are usually named "van der Waals" or "London" forces. Two atoms or molecules with permanent dipole moments, e.g. HCl, interact via on-resonant "dipole-dipole" interactions. These permanent dipole-dipole interactions, however, are always in addition to the van der Waals (temporary dipole-dipole) interactions. The similarity between dipole-dipole and van der Waals interactions is often clouded by the naming convention.
They are both calculated using the standard interaction potential of two interacting dipoles. Van der Waals interactions are off-resonant, temporary, second-order interactions, and dipole-dipole interactions are on-resonant, permanent, first-order interactions. Interatomic van der Waals interactions are present in all matter and play a large role in determining the melting points of all elements. For example, consider the melting point of He, 4K (-269C), as compared to the melting temperature of radon at 221K (-52C). In general, symmetric atoms like He and radon must first be cooled down significantly in order to condense, because they cannot align themselves into an array of aligned dipoles as effectively as elliptically shaped atoms. Atoms with more electrons, like radon, have larger van der Waals interactions, and thus must be heated more to break the van der Waals bonds and become a gas. This is because the electrons have larger orbits away from the nucleus, leaving the nucleus exposed with a higher probability. Furthermore, the nucleus of heavier atoms will induce larger dipoles in nearby atoms, and hence larger van der Waals interactions. As briefly mentioned above, there are two interaction regimes for the forces between atoms: the van der Waals regime and the dipole-dipole regime. We can see how the two regimes arise by looking at the Hamiltonian for two-particle interactions. Generically, the Hamiltonian contains energies on the diagonal and coupling terms on the off-diagonal. Here, we have a two-particle state AA that is coupled to another two-particle state BC through an interaction term Vint, with an energy detuning of D. In our case, the interaction term Vint is the dipole interaction operator, whose scaling is n^4/R^3. In the regime of van der Waals interactions, the coupling between the atoms is much less than the energy detuning D of the interaction. This leads to energy eigenstates that are shifted in energy by (Vint)^2/D. Since D scales like 1/n^3, the total scaling of the shift is n^11/R^6. Conversely, for dipole-dipole interactions, the energy detuning D is much smaller than the interaction Vint. In this case, the scaling is simply n^4/R^3.
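The crossover between the two regimes can be seen in a two-state toy model; the snippet below (arbitrary units, an added illustration rather than anything from the group's experiments) diagonalizes the Hamiltonian described above and compares the exact shift of the pair state with the V^2/D and V limits.

```python
# Pair state AA couples to BC with strength V and detuning D.
import numpy as np

def shift_of_AA(V, D):
    """Energy shift of the lower eigenstate relative to the bare AA energy (0)."""
    H = np.array([[0.0, V], [V, D]])
    return np.linalg.eigvalsh(H)[0]

D = 1.0
for V in (0.01, 0.1, 1.0, 10.0):
    exact = shift_of_AA(V, D)
    vdw = -V**2 / D      # van der Waals limit,    V << D
    res = -V             # resonant dipole-dipole,  V >> D
    print(f"V/D = {V/D:6.2f}:  exact {exact:9.4f}   -V^2/D {vdw:9.4f}   -V {res:9.4f}")
```

For weak coupling the exact shift tracks -V^2/D, and for strong coupling it tracks -V, matching the scaling arguments above.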
<urn:uuid:5cb33a2e-87bc-4600-9d03-03c9a17a4f6f>
CC-MAIN-2014-35
http://cold-atoms.physics.lsa.umich.edu/projects/dipoleblockade/blockade.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500835822.36/warc/CC-MAIN-20140820021355-00244-ip-10-180-136-8.ec2.internal.warc.gz
en
0.914637
1,257
3.65625
4
Diamonds have long been available in pairs—say, mounted in a nice set of earrings. But physicists have now taken that pairing to a new level, linking two diamonds on the quantum level. A group of researchers report in the December 2 issue of Science that they managed to entangle the quantum states of two diamonds separated by 15 centimeters. Quantum entanglement is a phenomenon by which two or more objects share an unseen link bridging the space between them—a hypothetical pair of entangled dice, for instance, would always land on matching numbers, even if they were rolled in different places simultaneously. But that link is fragile, and it can be disrupted by any number of outside influences. For that reason entanglement experiments on physical systems usually take place in highly controlled laboratory setups—entangling, say, a pair of isolated atoms cooled to nearly absolute zero. In the new study, researchers from the University of Oxford, the National Research Council of Canada and the National University of Singapore (NUS) showed that entanglement can also be achieved in macroscopic objects at room temperature. "What we have done is demonstrate that it's possible with more standard, everyday objects—if diamond can be considered an everyday object," says study co-author Ian Walmsley, an experimental physicist at Oxford. "It's possible to put them into these quantum states that you often associate with these engineered objects, if you like—these closely managed objects." To entangle relatively large objects, Walmsley and his colleagues harnessed a collective property of diamonds: the vibrational state of their crystal lattices. By targeting a diamond with an optical pulse, the researchers can induce a vibration in the diamond, creating an excitation called a phonon—a quantum of vibrational energy. Researchers can tell when a diamond contains a phonon by checking the light of the pulse as it exits. Because the pulse has deposited a tiny bit of its energy in the crystal, one of the outbound photons is of lower energy, and hence longer wavelength, than the photons of the incoming pulse. Walmsley and his colleagues set up an experiment that would attempt to entangle two different diamonds using phonons. They used two squares of synthetically produced diamond, each three millimeters across. A laser pulse, bisected by a beam splitter, passes through the diamonds; any photons that scatter off of the diamond to generate a phonon are funneled into a photon detector. One such photon reaching the detector signals the presence of a phonon in the diamonds. But because of the experimental design, there is no way of knowing which diamond is vibrating. "We know that somewhere in that apparatus, there is one phonon," Walmsley says. "But we cannot tell, even in principle, whether that came from the left-hand diamond or the right-hand diamond." In quantum-mechanical terms, in fact, the phonon is not confined to either diamond. Instead the two diamonds enter an entangled state in which they share one phonon between them. To verify the presence of entanglement, the researchers carried out a test to check that the diamonds were not acting independently. In the absence of entanglement, after all, half the laser pulses could set the left-hand diamond vibrating and the other half could act on the right-hand diamond, with no quantum correlation between the two objects. If that were the case, then the phonon would be fully confined to one diamond. 
If, on the other hand, the phonon were indeed shared by the two entangled diamonds, then any detectable effect of the phonon could bear the imprint of both objects. So the researchers fired a second optical pulse into the diamonds, with the intent of de-exciting the vibration and producing a signal photon that indicates that the phonon has been removed from the system. The phonon's vibrational energy gives the optical pulse a boost, producing a photon with higher energy, or shorter wavelength, than the incoming photons and eliminating the phonon in the process. Once again, there is no way of knowing which diamond produced the photon, because the paths leading from each diamond to the detectors are merged, so there is no way of knowing where the phonon was. But the researchers found that each of the photon paths leading from the diamonds to the detectors had an interfering effect on the other—adjusting how the two paths were joined affected the photon counts in the detectors. In essence, a single photon reaching the detectors carried information about both paths. So it cannot be said to have traveled down one path from one diamond: the photon, as with the vibrational phonon that produced it, came from both diamonds. After running the experiment over and over again to gather statistically significant results, the researchers concluded with confidence that entanglement had indeed been achieved. "We can't be 100 percent certain that they're entangled, but our statistical analysis shows that we're 98 percent confident in that, and we think that's a pretty good outcome," Walmsley says. The catch to using phonons for macroscopic entanglement is that they do not last long—only seven picoseconds, or seven trillionths of a second, in diamond. So the experimenters had to rely on extremely fast optical pulses to carry out their experiment, creating entangled states with phonons and then damping the phonons with the second pulse to test that entanglement just 0.35 picoseconds later. Because of this brevity, such entanglement schemes may not take over for more established techniques using photons or single atoms, but Walmsley hopes that researchers will consider the possibilities of using fairly ordinary, room-temperature materials in quantum technologies. "I think it gives a new scenario and a new instantiation of something that helps point in that direction," he says. Indeed, the new study is just the latest to show how quantum mechanics applies in real-world, macroscopic systems. Oxford and NUS physicist Vlatko Vedral, who was not involved in the new research, says it "beautifully illustrates" the point of Austrian physicist Erwin Schrödinger's famous thought experiment in which a hypothetical cat is simultaneously alive and dead. "It can't be that entanglement exists at the micro level (say of photons) but not at the macro level (say of diamonds)," because those worlds interact, Vedral wrote in an email. "Schrödinger used atoms instead of photons and cats instead of diamonds, but the point is the same."
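The which-path argument above can be captured in a toy calculation; the sketch below (my own illustration, not the team's analysis) treats the shared phonon as a single excitation split between two paths and shows that the detector probabilities depend on the relative phase, which is exactly the kind of interference the researchers looked for.

```python
# One excitation shared between two paths, recombined on a 50:50 beam splitter.
import numpy as np

def detector_probs(phi):
    amp_left = 1 / np.sqrt(2)
    amp_right = np.exp(1j * phi) / np.sqrt(2)
    det1 = (amp_left + amp_right) / np.sqrt(2)   # beam-splitter outputs
    det2 = (amp_left - amp_right) / np.sqrt(2)
    return abs(det1) ** 2, abs(det2) ** 2

for phi in (0.0, np.pi / 2, np.pi):
    p1, p2 = detector_probs(phi)
    print(f"relative phase {phi:4.2f} rad:  P(det 1) = {p1:.2f}, P(det 2) = {p2:.2f}")
# An incoherent 50/50 mixture (excitation definitely in one diamond or the other)
# would give 0.50/0.50 for every phase, so the phase dependence is the signature
# of the shared, entangled excitation.
```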
<urn:uuid:8efb9806-3b82-4329-9f88-5f83daf755dc>
CC-MAIN-2014-35
http://www.scientificamerican.com/article/room-temperature-entanglement/
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1409535921957.9/warc/CC-MAIN-20140901014521-00060-ip-10-180-136-8.ec2.internal.warc.gz
en
0.95356
1,340
3.609375
4
Causality is one of the oldest and most important concepts of physics. Even as recently as the beginning of the 20th century, with the invention of special relativity, this concept was in some sense rediscovered: because events can change their temporal order in a relativistic framework, a great effort was made to preserve causality in the theory. There is a general consensus in the scientific community about this concept: for all scientific theories, even for all the theories that will come in the future, causality should be preserved. If causal relations are broken, a number of paradoxes and counter-intuitive results arise. You could even go back in time and kill your great-grandfather! In quantum mechanics, the discovery of entangled states, which are states whose correlations can act immediately even if the particles are separated by a distance of millions of light years, challenged this concept. The solution for preserving causality was to accept that quantum systems are intrinsically random and that no theory can give a complete description of them. Very recently, in a paper published in Nature Communications by Ognyan Oreshkov and coworkers from the University of Vienna (Reference 1), the concept of causality itself is discussed. Just by assuming that quantum mechanics is valid only locally, they show that it is difficult to talk about 'causal order'. As has been done before when analyzing the effects of quantum mechanics, the authors decided to illustrate their result with a thought experiment. The rules of this experiment are:
- There are two parties, Alice and Bob. They are in labs that are far away from each other.
- They both receive one random bit, either 0 or 1.
- They can send information out between their labs.
- They have to guess each other's bit. This decision should be made at the same time they send their information out.
Obviously, the experiment should be repeated several times, and the goal is to guess the other party's bit as many times as possible. The 'figure of merit' that measures how well we are performing the game is the probability of guessing correctly for both Alice and Bob together, which is a number between 0 and 1. Let us see what we can do in a classical, and causal, framework. It is clear that the success probability will depend in this case on the time order of the events. If Alice sends her information first, she can use it to communicate to Bob what her bit was. Indeed, Bob will then succeed all the time. The problem now is that Alice has no clue about Bob's bit, so the best she can do is just say something at random. The same problem arises if Bob is the first to send information. So, in the best possible scenario, the probability of success is 1 for one of them, the one who acts second, and ½ for the other, the one who acts first. That means that the best possible probability in a classical causal framework is ¾. So, is there any difference in a quantum mechanical framework? Not really: quantum mechanics is also a theory with a definite causal background and has to fulfil the same constraints. But what happens if we slightly modify quantum mechanics in order to remove the space-time background, making it valid only locally, but not globally? That is the problem analyzed in Ref. 1 by Oreshkov et al. There, the authors consider a similar experiment, where it is assumed that Alice and Bob can make any kind of quantum operation in their labs.
In these labs quantum mechanics holds, but there is no assumption of a preexisting background time or global causal structure. In this scenario, which differs from normal quantum mechanics, they show that the probability of success can be enhanced beyond the causal limit. The rules for the non-causal quantum game are:
- Each laboratory is isolated.
- Quantum mechanics can be applied locally in the labs, but there is no assumption about what happens globally.
- There are also no assumptions about the spatio-temporal location of the experiments. That means it is not defined who makes the measurement first.
- They do not communicate in this case. This is a necessary assumption, because without a definite spatio-temporal order it is not defined who acts first (and could communicate) and who acts second (and could not).
Based on these assumptions, the authors create a new framework based on local quantum mechanics for analyzing the possible strategies of Alice and Bob. The results are surprising: they find that a success probability of 0.853 can be reached, higher than the ¾ probability of the best causal scenario, and even without communication between them. And what does it mean? Is causality broken in this new theory, so that we can now communicate with our dead great-grandfather? That could be very interesting for science fiction writers, but it is not like that. The authors argue in their paper that, as quantum mechanics can be applied locally to Alice and Bob's labs, causality should be preserved. This is due to the noise in the evolution 'backward in time', and it is compatible with the Novikov principle. So, if causality itself is not broken, why is this result interesting? First, the analysis of new possible frameworks is always useful. In general relativity, for instance, when one imposes only local constraints, new and interesting features arise, such as exotic causal structures. It looks like something similar happens in the quantum regime. Also, these results imply that if quantum mechanics only works locally, new kinds of correlations appear, stronger than the ones that are usual in normal quantum mechanics, such as entanglement. Even if these correlations cannot break the causal order, as is to be expected, the potential implications are huge. We should not forget that entanglement leads to interesting applications such as quantum computing, quantum teleportation and cryptography. We cannot know which applications these new correlations may have. Finally, there is a more important question: are these correlations something real or just a mathematical trick? On this question, the authors mention in the discussion of their paper that these correlations might be found in regimes where current theories are untested, such as those in which both quantum mechanics and general relativity become relevant. So, in my opinion, for the moment this result is purely theoretical, but very interesting in any case. Studies of this kind, even if they are just theory, usually open a door to new ways of thinking, and new theories and potential applications can grow out of them. Only time can show how useful it will be.
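For concreteness, the two probabilities discussed above can be written out; in the snippet below the identification of the 0.853 figure with (2 + √2)/4 is my own reading of the known result rather than something stated in the post.

```python
import math

p_causal = 3 / 4                       # best achievable with a definite causal order
p_noncausal = (2 + math.sqrt(2)) / 4   # value reachable without a global causal order

print(f"causal bound     : {p_causal:.4f}")    # 0.7500
print(f"non-causal value : {p_noncausal:.4f}") # 0.8536, the 0.853 quoted above
```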
<urn:uuid:195d83ff-8b76-4d44-96c9-b50278dacbe9>
CC-MAIN-2014-35
http://mappingignorance.org/2012/12/04/quantum-correlations-with-no-causal-order/
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500825567.38/warc/CC-MAIN-20140820021345-00055-ip-10-180-136-8.ec2.internal.warc.gz
en
0.951041
1,372
3.671875
4
Many important problems in physics—especially low-temperature physics—remain poorly understood because the underlying quantum mechanics is vastly complex. Conventional computers—even supercomputers—are inadequate for simulating quantum systems with as few as 30 particles. Better computational tools are needed to understand and rationally design materials, such as high-temperature superconductors, whose properties are believed to depend on the collective quantum behavior of hundreds of particles. The NIST quantum simulator permits study of quantum systems that are difficult to study in the laboratory and impossible to model with a supercomputer. The heart of the simulator is a two-dimensional crystal of beryllium ions (blue spheres in the accompanying graphic); the outermost electron of each ion is a quantum bit (qubit, red arrows). The ions are confined by a large magnetic field in a device called a Penning trap (not shown). Inside the trap the crystal rotates clockwise.
Nature: Engineered two-dimensional Ising interactions in a trapped-ion quantum simulator with hundreds of spins
In a photograph of the crystal, the ions are fluorescing, indicating that the qubits are all in the same state. Under the right experimental conditions, the ion crystal spontaneously forms this nearly perfect triangular lattice structure. The NIST simulator consists of a tiny, single-plane crystal of hundreds of beryllium ions, less than 1 millimeter in diameter, hovering inside a device called a Penning trap. The outermost electron of each ion acts as a tiny quantum magnet and is used as a qubit—the quantum equivalent of a "1" or a "0" in a conventional computer. In the benchmarking experiment, physicists used laser beams to cool the ions to near absolute zero. Carefully timed microwave and laser pulses then caused the qubits to interact, mimicking the quantum behavior of materials otherwise very difficult to study in the laboratory. Although the two systems may outwardly appear dissimilar, their behavior is engineered to be mathematically identical. In this way, simulators allow researchers to vary parameters that couldn't be changed in natural solids, such as atomic lattice spacing and geometry. In the NIST benchmarking experiments, the strength of the interactions was intentionally weak so that the simulation remained simple enough to be confirmed by a classical computer. Ongoing research uses much stronger interactions. Simulators exploit a property of quantum mechanics called superposition, wherein a quantum particle is made to be in two distinct states at the same time, for example, aligned and anti-aligned with an external magnetic field. So the number of states simultaneously available to 3 qubits, for example, is 8, and this number grows exponentially with the number of qubits: 2^N states for N qubits. Crucially, the NIST simulator can also engineer a second quantum property, called entanglement, between the qubits, so that even physically well separated particles may be made tightly interconnected. Recent years have seen tremendous interest in quantum simulation; scientists worldwide are striving to build small-scale demonstrations. However, these experiments have yet to fully involve more than 30 quantum particles, the threshold at which calculations become impossible on conventional computers. In contrast, the NIST simulator has extensive control over hundreds of qubits.
This order-of-magnitude increase in qubit number increases the simulator's quantum state space exponentially. Just writing down on paper a state of a 350-qubit quantum simulator is impossible—it would require more than a googol of digits: 10 to the power of 100. Over the past decade, the same NIST research group has conducted record-setting experiments in quantum computing, atomic clocks and, now, quantum simulation. In contrast with quantum computers, which are universal devices that someday may solve a wide variety of computational problems, simulators are "special purpose" devices designed to provide insight about specific problems.
The presence of long-range quantum spin correlations underlies a variety of physical phenomena in condensed-matter systems, potentially including high-temperature superconductivity. However, many properties of exotic, strongly correlated spin systems, such as spin liquids, have proved difficult to study, in part because calculations involving N-body entanglement become intractable for as few as N ≈ 30 particles. Feynman predicted that a quantum simulator—a special-purpose 'analogue' processor built using quantum bits (qubits)—would be inherently suited to solving such problems. In the context of quantum magnetism, a number of experiments have demonstrated the feasibility of this approach, but simulations allowing controlled, tunable interactions between spins localized on two- or three-dimensional lattices of more than a few tens of qubits have yet to be demonstrated, in part because of the technical challenge of realizing large-scale qubit arrays. Here we demonstrate a variable-range Ising-type spin–spin interaction, J_ij, on a naturally occurring, two-dimensional triangular crystal lattice of hundreds of spin-half particles (beryllium ions stored in a Penning trap). This is a computationally relevant scale more than an order of magnitude larger than previous experiments. We show that a spin-dependent optical dipole force can produce an antiferromagnetic interaction J_ij ∝ 1/d_ij^a, where 0 ≤ a ≤ 3 and d_ij is the distance between spin pairs. These power laws correspond physically to infinite-range (a = 0), Coulomb-like (a = 1), monopole-dipole (a = 2) and dipole-dipole (a = 3) couplings. Experimentally, we demonstrate excellent agreement with theory for 0.05 ≲ a ≲ 1.4. This demonstration, coupled with the high spin count, excellent quantum control and low technical complexity of the Penning trap, brings within reach the simulation of otherwise computationally intractable problems in quantum magnetism.
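Two numbers quoted above lend themselves to a quick check; the snippet below (added here, not from the article or the paper) verifies that 2^350 exceeds a googol and tabulates how the power-law couplings fall off with distance.

```python
import math

# A 350-qubit state has 2^350 amplitudes; log10 shows this exceeds a googol (10^100).
print("log10(2**350) =", 350 * math.log10(2))          # ~105.4

# Relative strength of the couplings J ~ 1/d^a when the distance doubles.
for a, label in [(0, "infinite range"), (1, "Coulomb-like"),
                 (2, "monopole-dipole"), (3, "dipole-dipole")]:
    print(f"a = {a} ({label:15s}): J(2d)/J(d) = {1 / 2**a:.3f}")
```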
<urn:uuid:9d41b2e7-c6a9-4ac0-b1e4-5e8564601574>
CC-MAIN-2014-35
http://nextbigfuture.com/2012/04/nist-physicists-benchmark-quantum.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500835670.21/warc/CC-MAIN-20140820021355-00201-ip-10-180-136-8.ec2.internal.warc.gz
en
0.908314
1,251
4.125
4
Diamonds to dust
One aim of future research is to ultimately confine the light interaction to the atomic scale and to demonstrate selective single-atom removal. (Photo: Carlo Bradac)
Small, it seems, is never quite small enough. In their relentless quest to build ever-more minuscule and compact electronic devices, scientists have attempted to manipulate a variety of materials down to the atomic level. For many reasons, this has proved tough to achieve. Now, a team of Australian researchers has succeeded in using intense pulses of laser light to move individual atoms in substances as rock-solid as diamonds. The breakthrough is likely to lead to new types of nano-scale devices measuring just billionths of a metre, including minute sensors, super-small and fast electronic components and data storage systems, quantum computers and perhaps a new generation of high-powered lasers on tiny chips.
The discovery, reported in the British journal Nature Communications, resulted more from serendipity than planning. "To our surprise, we found that ultraviolet lasers could be used to target specific atoms," says team leader Richard Mildren, of Macquarie University in Sydney. "We knew that UV lasers could eject atoms from the surface of diamonds – even at very low light levels," Associate Professor Mildren explains. "But there was no clue to suggest that this process could be harnessed to remove a single targeted atom." The telling clue, he says, came from ongoing research using an intense UV laser to slice through small sections of diamond. Diamonds derive their hardness from the way their carbon atoms are arranged in an extremely rigid grid, known as a crystal lattice. The rigidity results from each atom being bound tightly to four other carbon atoms. Although diamonds are generally transparent to UV rays, a sliver of the light is absorbed very close to the surface. "We think it may occur in the top one or two rows of atoms," Professor Mildren says. "The surface of diamond is normally covered in oxygen atoms and we suspect the carbon is released in the form of carbon monoxide molecules." The added energy, he says, is enough to break the chemical bonds that normally bind carbon atoms to the surface.
The scientists found that it takes the energy of two UV light particles, or photons, to dislodge one carbon atom. "Carbon atoms are ejected from the surface one by one," he says. "The rate at which this happens is very predictable." Exactly how the energy is absorbed, and leads to the bonds being broken, is not yet well understood. "This is something we need to work on." Not any old light does the trick. The diamonds his team worked on were exposed to a very specific form of light pulses in the UV-C band. These are the sun's harshest rays that are largely filtered out by Earth's ozone layer. A few seconds after being bombarded with light pulses, the diamonds developed small pits on their surface. "The rate of mass loss in the diamond fell notably for lower light levels," Professor Mildren says. "But the etching process still continued – albeit at a slower and slower pace." The rate of this etching is so slow that it is not noticeable under normal circumstances. In fact, even under very bright conditions, such as intense sunlight or a UV tanning lamp, it would take roughly the age of the universe – almost 14 billion years – to make an appreciable impact on a diamond. This is where lasers come in handy.
These are basically devices that emit intense beams of light by amplifying the light waves using a process called stimulated emission of electromagnetic radiation. The term laser, in fact, is an acronym for "light amplification by stimulated emission of radiation". Beams emitted in this way differ from other sources of light because they emit the light coherently. In essence, this means that a "hot" beam can be intensified and concentrated onto a very small area. This allows them to cut or weld through virtually any solid material. The laser's ability to cut out components on dimensions much smaller than the width of a human hair makes them prized tools in high-tech industries, including electronics and car making. But at smaller scales – such as the distances between atoms – lasers were, until recently, generally considered to be quite ineffectual. The problem, Professor Mildren says, is that laser cutting and material processing have relied on the heat produced by a laser's beam, in many cases stripping electrons from their parent atomic nuclei. The smallest cuts that could be made depended on the amount of heat transferred to the surface. "There are now promising signs that it is possible to use lasers to carve up a material with atomic resolution – that is to pick apart a substance atom by atom by using a light beam to snip the chemical bonds holding the individual atoms together," he explains. The researchers have experimented with their lasers, for example machining a variety of surfaces. Examination of the machined surfaces using a high-powered electron microscope showed the formation of a curious pattern of regular nano-structures, Professor Mildren says. "The key observation came when varying the light beam's polarisation – that is, the direction of the light wave's oscillating movement. The particular shape and orientation of these patterns altered with the way chemical bonds of surface atoms lined up with the polarisation." This surprising observation, he says, provided the essential clue that the light was somehow interacting with individual bonds. "It also showed that chemical bonds can be broken before there is any significant dissipation of energy to cause damage to the surrounding area." Low-cost production of high-quality diamonds from synthetic sources is driving developments in areas such as ultra-fast electronics, quantum computing devices and miniature high-powered diamond lasers. "Having a new tool to construct and manipulate diamond devices at the ultimate level of resolution is very exciting for developing these future technologies," Professor Mildren says. "We have already shown that it's possible to make diamond structures of less than 20 nanometres – within the size range of large molecules. This is many tens of times smaller than what could previously be achieved, and suitably small to be of immediate use in applications such as super-low friction surfaces and advanced light sources." The next goal, he says, is to develop ways to treat single or small groups of atoms. "We would like to manipulate surfaces with single-atom precision, or more than 10,000 times smaller than that possible by standard laser machining techniques. This is an area full of interesting challenges for confining a laser beam sufficiently to gain the necessary level of control." Professor Mildren and colleagues Andrew Lehmann and Carlo Bradac admit the mechanisms behind this process are not yet well understood. 
"So it is important to study the process in greater detail, asking such questions as: how is the light absorbed? And: how are the chemical bonds broken without significant leakage of energy into surrounding areas?" That this effect was first detected in diamonds is no coincidence, Professor Mildren says. "Although they have been known for thousands of years, diamonds are only now gaining true importance in science and technology. They have very highly defined bonds that are relatively disconnected from neighbouring atoms. So another key question is this: how many materials other than diamond can we laser-pick apart like this? And what might be the consequences?" Discover more about UV light and how it relates the electromagnetic spectrum at: http://science.hq.nasa.gov/kids/imagers/ems/uv.html Please send bright ideas for new topics to firstname.lastname@example.org
<urn:uuid:6a87e333-1f7a-475c-966e-1b2088e19478>
CC-MAIN-2014-35
http://www.theage.com.au/national/education/brilliance-in-diamond-dust-20140314-34qjl.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500829754.11/warc/CC-MAIN-20140820021349-00365-ip-10-180-136-8.ec2.internal.warc.gz
en
0.95406
1,564
3.765625
4
The Bell states are a concept in quantum information science and represent the simplest examples of entanglement. They are named after John S. Bell because they are the subject of his famous Bell inequality. An EPR pair is a pair of qubits which are in a Bell state together, that is, entangled with each other. Unlike classical phenomena such as the nuclear, electromagnetic, and gravitational fields, entanglement is invariant under distance of separation and is not subject to relativistic limitations such as the speed of light.
The Bell states
The degree to which a state is entangled is monotonically measured by the von Neumann entropy of the reduced density operator of the state. The von Neumann entropy of a pure state is zero - also for the Bell states, which are specific pure states. But the von Neumann entropy of the reduced density operator of the Bell states is maximal. In order to explain this, it is important to first look at the Bell state |Φ+⟩ = (|0⟩_A|0⟩_B + |1⟩_A|1⟩_B)/√2. This expression means the following: the qubit held by Alice (subscript "A") can be 0 as well as 1. If Alice measured her qubit in the standard basis the outcome would be perfectly random, either possibility having probability 1/2. But if Bob then measured his qubit, the outcome would be the same as the one Alice got. So, if Bob measured, he would also get a random outcome on first sight, but if Alice and Bob communicated they would find out that, although the outcomes seemed random, they are correlated. So far, this is nothing special: maybe the two particles "agreed" in advance, when the pair was created (before the qubits were separated), which outcome they would show in case of a measurement. Hence, argued Einstein, Podolsky, and Rosen in 1935 in their famous "EPR paper", there is something missing in the description of the qubit pair given above—namely this "agreement", called more formally a hidden variable. But quantum mechanics allows qubits to be in quantum superposition—i.e. in 0 and 1 simultaneously—that is, a linear combination of the two classical states—for example, the states |+⟩ = (|0⟩ + |1⟩)/√2 or |−⟩ = (|0⟩ − |1⟩)/√2. If Alice and Bob chose to measure in this basis, i.e. check whether their qubit were |+⟩ or |−⟩, they would find the same correlations as above. That is because the Bell state can be formally rewritten as |Φ+⟩ = (|+⟩_A|+⟩_B + |−⟩_A|−⟩_B)/√2. Note that this is still the same state. John S. Bell showed in his famous paper of 1964, using simple probability theory arguments, that these correlations cannot be perfect in the case of "pre-agreement" stored in some hidden variables—but that quantum mechanics predicts perfect correlations. In a more formal and refined formulation known as the Bell-CHSH inequality, this is stated by saying that a certain correlation measure cannot exceed the value 2 according to reasoning assuming local "hidden variable" (sort of common-sense) physics, but quantum mechanics predicts 2√2. There are three other specific states of two qubits which are also regarded as Bell states and which lead to this maximal value of 2√2. The four are known as the four maximally entangled two-qubit Bell states:
|Φ+⟩ = (|00⟩ + |11⟩)/√2
|Φ−⟩ = (|00⟩ − |11⟩)/√2
|Ψ+⟩ = (|01⟩ + |10⟩)/√2
|Ψ−⟩ = (|01⟩ − |10⟩)/√2
Bell state measurement
The Bell measurement is an important concept in quantum information science: it is a joint quantum-mechanical measurement of two qubits that determines which of the four Bell states the two qubits are in.
If the qubits were not in a Bell state before, they get projected into a Bell state (according to the projection rule of quantum measurements), and as Bell states are entangled, a Bell measurement is an entangling operation. Bell-state measurement is the crucial step in quantum teleportation. The result of a Bell-state measurement is used by one's co-conspirator to reconstruct the original state of a teleported particle from half of an entangled pair (the "quantum channel") that was previously shared between the two ends. Experiments which utilize so-called "linear evolution, local measurement" techniques cannot realize a complete Bell state measurement. Linear evolution means that the detection apparatus acts on each particle independently from the state or evolution of the other, and local measurement means that each particle is localized at a particular detector registering a "click" to indicate that a particle has been detected. Such devices can be constructed, for example, from mirrors, beam splitters, and wave plates, and are attractive from an experimental perspective because they are easy to use and have a high measurement cross-section. For entanglement in a single qubit variable, only three distinct classes out of the four Bell states are distinguishable using such linear optical techniques. This means two Bell states cannot be distinguished from each other, limiting the efficiency of quantum communication protocols such as teleportation. If a Bell state is measured from this ambiguous class, the teleportation event fails. Entangling particles in multiple qubit variables, such as (for photonic systems) polarization and a two-element subset of orbital angular momentum states, allows the experimenter to trace over one variable and achieve a complete Bell state measurement in the other. Leveraging so-called hyper-entangled systems thus has an advantage for teleportation. It also has advantages for other protocols such as superdense coding, in which hyper-entanglement increases the channel capacity. In general, for hyper-entanglement in n variables, one can distinguish between at most 2^(n+1) − 1 classes out of the 4^n Bell states using linear optical techniques.
- Nielsen, Michael A.; Chuang, Isaac L. (2000), Quantum Computation and Quantum Information, Cambridge University Press, ISBN 978-0-521-63503-5, p. 25.
- Kaye, Phillip; Laflamme, Raymond; Mosca, Michele (2007), An Introduction to Quantum Computing, Oxford University Press, ISBN 978-0-19-857049-3, p. 75.
- Bell, J. S. (1964), "On the Einstein Podolsky Rosen Paradox", Physics 1, 195–200.
- Quantum Entanglement in Electron Optics: Generation, Characterization, and Applications, Naresh Chandra, Rama Ghosh, Springer, 2013, ISBN 3642240704, p. 43, Google Books
- Kwiat, Weinfurter, "Embedded Bell State Analysis"
- Pisenti, Gaebler, Lynn, "Distinguishability of Hyper-Entangled Bell States by Linear Evolution and Local Measurement"
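As a quick numerical illustration of the Bell states and the CHSH values discussed above (2 for local hidden variables, 2√2 for quantum mechanics), here is a short NumPy check; it is an added sketch, and the measurement settings are the standard textbook ones rather than anything taken from the article.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)   # |Phi+> = (|00> + |11>)/sqrt(2)

def corr(A, B):
    """Correlation <A (x) B> in the state |Phi+>."""
    return np.real(phi_plus.conj() @ np.kron(A, B) @ phi_plus)

# Standard settings: Alice measures Z and X, Bob measures (Z ± X)/sqrt(2).
A0, A1 = Z, X
B0, B1 = (Z + X) / np.sqrt(2), (Z - X) / np.sqrt(2)

S = corr(A0, B0) + corr(A0, B1) + corr(A1, B0) - corr(A1, B1)
print("CHSH value:", S)   # ~2.828 = 2*sqrt(2), above the local hidden-variable bound 2
```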
<urn:uuid:bc41ab3b-cab8-472f-ac1c-e376058f2a6f>
CC-MAIN-2014-35
http://en.wikipedia.org/wiki/Bell_state
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500835872.63/warc/CC-MAIN-20140820021355-00312-ip-10-180-136-8.ec2.internal.warc.gz
en
0.92213
1,330
4.15625
4
Quantum computing has long been a wacky, borderline fictional, mostly theoretical domain of physics reserved for highly speculative conversation. This is because quantum mechanics, the physics of the subatomic world, makes some claims completely void of common sense. Particle physicists believe that a subatomic particle called a neutrino can pass through the entire Earth without slowing down, and that particles can be in two different states at the same time, and even that two particles can be entangled in such a way that their properties will match across any distance (imagine if flipping a light switch in Kansas caused a light switch on Saturn to flip as well). Various governments have poured money into the exploration of these theories—a giant sub-atomic roller rink was built in Geneva, Switzerland to test many of them, resulting in the discovery of the Higgs boson. But there hasn't been much use for these theories in practical application. That is, until the concept of quantum computing came about. If a particle can be in two states at once, then perhaps this could be used to speed up computation by incredible amounts. Traditional bits in computers can either be on (1) or off (0), but quantum bits, or qubits, can be both on and off at the same time (called superposition), allowing them to perform many computations in parallel. Make a device that runs with qubits as the base system and you've got a quantum computer. The ability to be both on and off simultaneously allows quantum computers to use a process called annealing and makes a computer extraordinarily fast, as it's able to process all scenarios at once. A quantum computer with just 300 qubits could run more calculations in an instant than there are particles in the whole universe. But all of this is just in theory. As the theories about quantum computing grew and became entangled with increased media attention and speculation about their capabilities, the idea of quantum computing morphed into miracle computing. It was believed that if quantum computers existed, they could cure disease, power artificially intelligent robots, make time machines function, drive cars, and solve the problems of global warming. While all the speculation was brewing and more scientists and researchers wrote papers on the matter, one company started to build quantum computers. In 2011, the Canadian company D-Wave (backed by the CIA and individuals like Jeff Bezos of Amazon) sold its first machine to defense contractor Lockheed Martin. In early May, D-Wave sold its second machine, the 439-qubit D-Wave Two, to the Quantum Artificial Intelligence Lab for $15 million. The lab, which is backed by NASA, Google, and the Universities Space Research Association (USRA), will use the device to make advances in machine learning, a field of computer science where computers become more adept at solving problems the more experience they have. Researcher Catherine McGeoch, a professor of computer science at Amherst College, put theory into practice and tested the D-Wave prototype to see if it really was a quantum leap forward. Her findings conclude that the device is fast, but only at specific tasks. "On the largest problem sizes tested, the V5 chip found optimal solutions in less than half a second, while the best software solver, CPLEX, needed 30 minutes to find all optimal solutions," McGeoch writes in the conclusions section of her academic paper, where CPLEX is a conventional software solver and V5 is the chip in the D-Wave prototype.
They received a second V6 chip after most of the study had finished, but they decided to test it anyway, concluding "V6 is three to five times faster than V5" and "preliminary results suggest that… the hardware can find optimal solutions around 10,000 times faster than CPLEX." But these incredible numbers can be a little misleading. This doesn't say that the D-Wave Two is generally 3,600 to 10,000 times faster than a conventional computer, rather that it solved a specific problem that much faster than the current standard solver CPLEX. As McGeoch told the New Yorker after many media organizations stated the quantum computer was 3,600 times faster, "the 3,600 number does not give any information about comparative performance of the two types of platforms. It was never intended to." Another misleading detail is that the baseline machines that the D-Wave Two was being compared against are simple desktop machines that cost only $1,200. The D-Wave machine wasn't being compared to state-of-the-art supercomputers, but with something you could more or less pick up at Best Buy. For the cost of one D-Wave Two, you could buy 12,500 of the traditional machines. This doesn't exactly seem like a fair comparison, and it doesn't even account for the fact that the D-Wave Two needs extraordinary conditions to run: because the machine's chip requires a near-total absence of electrical resistance, known as superconductivity, to function, the machine must be cooled to nearly absolute zero. Furthermore, there's even some doubt as to whether D-Wave's machines are actually quantum computers. It's very difficult to tell if a device is actually using a process called quantum tunneling or if a similar effect is being achieved through normal thermal fluctuations. McGeoch even admitted she wasn't sure how the machine actually operated and simply deferred to previous research that said the D-Wave machine is "at least a little quantum mechanical." All things considered, it seems we only have an incredibly expensive machine that looks like a quantum computer. Even at its best, a true quantum computer isn't the magical solution we might be looking for. Due to the way quantum devices are structured, they may excel at solving specific problems that require multiple calculations simultaneously (like determining how to seat guests at the dinner table so people who dislike each other aren't placed together), but they fall short at other computational tasks like running Photoshop or Microsoft Word or browsing Facebook. While this is definitely a simplification, it supports the point that, with the current state of the technology, quantum computers will have to be coupled with traditional computers to comprehensively perform tasks. In Google's announcement of its purchase of the D-Wave Two, Director of Engineering Hartmut Neven admitted that in trying to better understand machine learning "we've learned some useful principles: e.g., you get the best results not with pure quantum computing, but by mixing quantum and classical computing." Even though quantum mechanical devices may be here to the tune of $15 million, there still will be years of research, development, and debate to determine if the technology is the miracle device science fiction hopes it is.
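For reference, the headline figure debated above follows directly from the two timings quoted from McGeoch's paper; the one-line arithmetic below is an added illustration.

```python
cplex_seconds = 30 * 60   # "30 minutes to find all optimal solutions"
v5_seconds = 0.5          # "less than half a second"
print("implied speed-up on that benchmark:", cplex_seconds / v5_seconds)  # 3600.0
```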
“Launching the Quantum Artificial Intelligence Lab”, Google Adrian Cho, “Controversial Computer Is at Least a Little Quantum Mechanical”, Science Gary Marcus, “A Quantum Leap In Computing?”, The New Yorker Charles Choi, “Google and NASA Launch Quantum Computing AI Lab”, MIT Technology Review Catherine C. McGeoch, “Experimental Evaluation of an Adiabiatic Quantum System for Combinatorial Optimization” John Naughton, “Is computing speed set to make a quantum leap?” The Guardian
<urn:uuid:cf6ac822-f262-4bdc-b90e-0d078bac9f5b>
CC-MAIN-2014-35
http://theairspace.net/science/quantum-computing-is-real-but-not-very-useful/
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1409535925433.20/warc/CC-MAIN-20140901014525-00176-ip-10-180-136-8.ec2.internal.warc.gz
en
0.943635
1,532
3.5625
4
Squeezed coherent state In physics, a squeezed coherent state is any state of the quantum mechanical Hilbert space such that the uncertainty principle is saturated, that is, the product of the uncertainties of the two corresponding operators takes on its minimum value: $\Delta x \, \Delta p = \hbar/2$. Often, the term squeezed state is used for any state with $\Delta x \neq \Delta p$ in "natural oscillator units". The idea behind this is that the circle denoting a coherent state in a quadrature diagram (see below) has been "squeezed" to an ellipse of the same area. The most general wave function that satisfies the identity above is the squeezed coherent state (we work in units with $\hbar = 1$): $\psi(x) = C \exp\!\left(-\frac{(x - x_0)^2}{2 w_0^2} + i p_0 x\right)$, where $C$, $x_0$, $w_0$ and $p_0$ are constants (a normalization constant, the center of the wavepacket, its width, and the expectation value of its momentum). The new feature relative to a coherent state is the free value of the width $w_0$, which is the reason why the state is called "squeezed". The squeezed state above is an eigenstate of the linear operator $\hat{x} + i \hat{p} w_0^2$, and the corresponding eigenvalue equals $x_0 + i p_0 w_0^2$. In this sense, it is a generalization of the ground state as well as the coherent state. Examples of squeezed coherent states Depending on the phase at which the state's quantum noise is reduced, one can distinguish amplitude-squeezed and phase-squeezed states, or general quadrature-squeezed states. If no coherent excitation exists, the state is called a squeezed vacuum. The figures below give a nice visual demonstration of the close connection between squeezed states and Heisenberg's uncertainty relation: diminishing the quantum noise at a specific quadrature (phase) of the wave has as a direct consequence an enhancement of the noise of the complementary quadrature, that is, the field at the phase shifted by $\pi/2$. From the top: - Vacuum state - Squeezed vacuum state - Phase-squeezed state - Arbitrary squeezed state - Amplitude-squeezed state As can be seen at once, in contrast to the coherent state, the quantum noise for a squeezed state is no longer independent of the phase of the light wave. A characteristic broadening and narrowing of the noise during one oscillation period can be observed. The wave packet of a squeezed state is defined by the square of the wave function introduced in the last paragraph; it corresponds to the probability distribution of the electric field strength of the light wave. The moving wave packets display an oscillatory motion combined with the widening and narrowing of their distribution: the "breathing" of the wave packet. For an amplitude-squeezed state, the most narrow distribution of the wave packet is reached at the field maximum, resulting in an amplitude that is defined more precisely than that of a coherent state. For a phase-squeezed state, the most narrow distribution is reached at field zero, resulting in an average phase value that is better defined than that of a coherent state. In phase space, quantum mechanical uncertainties can be depicted by the Wigner quasi-probability distribution. The intensity of the light wave, its coherent excitation, is given by the displacement of the Wigner distribution from the origin. A change in the phase of the squeezed quadrature results in a rotation of the distribution. Photon number distributions and phase distributions of squeezed states For amplitude-squeezed light the photon number distribution is usually narrower than that of a coherent state of the same amplitude, resulting in sub-Poissonian light, whereas its phase distribution is wider.
The opposite is true for phase-squeezed light, which displays a large intensity (photon number) noise but a narrow phase distribution. Nevertheless, the statistics of amplitude-squeezed light have not been observed directly with a photon-number-resolving detector because of experimental difficulty. For the squeezed vacuum state the photon number distribution displays odd-even oscillations. This can be explained by the mathematical form of the squeezing operator, which resembles the operator for two-photon generation and annihilation processes. Photons in a squeezed vacuum state are more likely to appear in pairs. Experimental realizations of squeezed coherent states There has been a whole variety of successful demonstrations of squeezed states. The most prominent ones were experiments with light fields using lasers and non-linear optics (see optical parametric oscillator). This is achieved by a simple process of four-wave mixing with a crystal; similarly, traveling-wave phase-sensitive amplifiers generate spatially multimode quadrature-squeezed states of light when the crystal is pumped in the absence of any signal. Sub-Poissonian current sources driving semiconductor laser diodes have led to amplitude-squeezed light. Squeezed states have also been realized via motional states of an ion in a trap, phonon states in crystal lattices, and atom ensembles. Even macroscopic oscillators have been driven into classical motional states that were very similar to squeezed coherent states. The current state of the art in noise suppression, for laser radiation using squeezed light, amounts to 12.7 dB. Squeezed states of the light field can be used to enhance precision measurements. For example, phase-squeezed light can improve the phase readout of interferometric measurements (see for example gravitational waves). Amplitude-squeezed light can improve the readout of very weak spectroscopic signals. Various squeezed coherent states, generalized to the case of many degrees of freedom, are used in various calculations in quantum field theory, for example the Unruh effect and Hawking radiation, and generally in particle production in curved backgrounds and Bogoliubov transformations. Recently, the use of squeezed states for quantum information processing in the continuous-variables (CV) regime has been increasing rapidly. Continuous-variable quantum optics uses squeezing of light as an essential resource to realize CV protocols for quantum communication, unconditional quantum teleportation and one-way quantum computing. This is in contrast to quantum information processing with single photons or photon pairs as qubits. CV quantum information processing relies heavily on the fact that squeezing is intimately related to quantum entanglement, as the quadratures of a squeezed state exhibit sub-shot-noise quantum correlations. - Loudon, Rodney, The Quantum Theory of Light (Oxford University Press, 2000), [ISBN 0-19-850177-3] - D. F. Walls and G. J. Milburn, Quantum Optics, Springer Berlin 1994 - C. W. Gardiner and Peter Zoller, "Quantum Noise", 3rd ed., Springer Berlin 2004 - D. Walls, Squeezed states of light, Nature 306, 141 (1983) - R. E. Slusher et al., Observation of squeezed states generated by four wave mixing in an optical cavity, Phys. Rev. Lett. 55 (22), 2409 (1985) - G. Breitenbach, S. Schiller, and J. Mlynek, "Measurement of the quantum states of squeezed light", Nature, 387, 471 (1997) - Entanglement evaluation with Fisher information - http://arxiv.org/pdf/quant-ph/0612099 - S.
Machida et al., Observation of amplitude squeezing in a constant-current–driven semiconductor laser, Phys. Rev. Lett. 58, 1000–1003 (1987) - http://link.aps.org/doi/10.1103/PhysRevLett.58.1000 - T. Eberle et al., Quantum Enhancement of the Zero-Area Sagnac Interferometer Topology for Gravitational Wave Detection, Phys. Rev. Lett., 22 June 2010 - http://arxiv.org/abs/1007.0574 - S. L. Braunstein and P. van Loock, "Quantum information with continuous variables," Rev. Mod. Phys., vol. 77, no. 2, pp. 513–577, Jun. 2005. http://link.aps.org/doi/10.1103/RevModPhys.77.513 - A. Furusawa, J. L. Sørensen, S. L. Braunstein, C. A. Fuchs, H. J. Kimble, and E. S. Polzik, "Unconditional Quantum Teleportation," Science, vol. 282, no. 5389, pp. 706–709, 1998. http://www.sciencemag.org/content/282/5389/706.abstract - N. C. Menicucci, S. T. Flammia, and O. Pfister, "One-Way Quantum Computing in the Optical Frequency Comb," Phys. Rev. Lett., vol. 101, no. 13, p. 130501, Sep. 2008. http://link.aps.org/doi/10.1103/PhysRevLett.101.130501
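As a quick numerical sanity check of the squeezed wave packet defined above, the short Python sketch below builds ψ(x) on a grid, computes Δx and Δp, and confirms that their product stays at ℏ/2 (with ℏ = 1) while the width parameter w₀ trades position noise against momentum noise. The grid size, parameter values and FFT-based momentum representation are illustrative choices, not anything prescribed by the article.

```python
import numpy as np

# Squeezed Gaussian wave packet psi(x) = C exp(-(x - x0)^2 / (2 w0^2) + i p0 x), hbar = 1.
x0, p0, w0 = 1.0, 2.0, 0.5          # centre, mean momentum, width; w0 != 1 means "squeezed"
N, L = 2**12, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

psi = np.exp(-(x - x0)**2 / (2 * w0**2) + 1j * p0 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)          # normalise

# Position uncertainty from the probability density |psi|^2
mean_x = np.sum(x * np.abs(psi)**2) * dx
mean_x2 = np.sum(x**2 * np.abs(psi)**2) * dx
delta_x = np.sqrt(mean_x2 - mean_x**2)

# Momentum uncertainty from the Fourier transform (momentum grid from fftfreq, hbar = 1)
p = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dp = p[1] - p[0]
phi = np.fft.fft(psi)
phi /= np.sqrt(np.sum(np.abs(phi)**2) * dp)          # renormalise on the momentum grid
mean_p = np.sum(p * np.abs(phi)**2) * dp
mean_p2 = np.sum(p**2 * np.abs(phi)**2) * dp
delta_p = np.sqrt(mean_p2 - mean_p**2)

print(delta_x, delta_p, delta_x * delta_p)   # ~0.354, ~1.414, ~0.5 = hbar/2
```

With w₀ = 0.5 the position spread shrinks below the coherent-state value while the momentum spread grows by the same factor, which is exactly the "squeezing" trade-off described in the text.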
<urn:uuid:081e50a6-0450-4424-acef-32df007f5dbc>
CC-MAIN-2014-35
http://en.wikipedia.org/wiki/Squeezed_coherent_state
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500836108.12/warc/CC-MAIN-20140820021356-00382-ip-10-180-136-8.ec2.internal.warc.gz
en
0.865182
1,843
3.828125
4
Computer Network Communication Devices Introduction to computer network devices Learning about network types and configuration remains incomplete unless we get to know the devices that enable communication between the computers in a network. Without communication devices, networks cannot be formed, so knowing their names and uses is equally important. To build a LAN, the following network communication devices are required: NIC stands for Network Interface Card; it is the most fundamental device in building a network. These adapters are a standard part of the computers used in our homes and offices. A NIC is also referred to as a LAN (local area network) card. Communication media (cables) attach to this card to build the network. Each NIC has a unique MAC address, and a unique IP address is assigned to it so that communication can begin. When building a WLAN, a wireless card is used instead of a LAN card; its functionality is the same, except that it connects to the router without wires. A router is an intelligent device that routes data to destination computers and connects two different logical and physical networks together. In a small network, the server is connected to the router along with the clients. Without routers, network communication is not possible; the router is the soul of the network, and without it the distribution of internet access and other network data to the entire network is impossible. A wireless router works in the same way, performing the same functions without using a medium such as cables. A router uses a data structure known as a routing table, which stores source and destination addresses. Major companies known for manufacturing routers and wireless routers include TP-Link, Cisco Systems, Nortel and D-Link. For networks on a larger scale, one or more hubs may be required. All computers connect directly to the hub, which acts as the centralized device on the network. When data is sent to the hub, the hub broadcasts it to all of its ports, and it is then received by the destination computer. If a hub fails to perform its routine functions, the entire network halts until the hub is restored to normal condition. A switch is another important device in larger networks. It is used in the same place as a hub, but the difference between the two is that a switch maintains a switching table. The switching table stores the MAC address of every computer connected to the switch and sends data only to the requested address, unlike a hub, which broadcasts data to all of its ports. Switches can be considered an advanced form of hubs. As the name suggests, a gateway is a passage to somewhere else. Interestingly, a gateway can be software or a physical device. A gateway device connects a LAN to the internet, and its basic function is to provide security to the network. With a gateway, incoming and outgoing traffic can be monitored for malicious activity that could harm network integrity. Modems come in two types: the dial-up modem found in many computers, which connects to the internet over a telephone line by dialing an ISP, and the DSL modem.
The function, however, is the same for both types of modem: modulation and demodulation. A modem converts digital signals into analog signals and analog signals back into digital so that data can travel over telephone lines. Cables connect the communication devices to each other to form the network. There are different types of cables; commonly used ones are 10BaseT/CAT5, coaxial, Ethernet and fiber-optic cable. Fiber optic is the most expensive, as it enables data transfer at the speed of light. It is a costly solution mostly adopted by the corporate sector, although in recent years optical fiber has also been used in home networking and as a medium for connecting to the internet.
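To make the hub/switch distinction above concrete, here is a small, hypothetical Python sketch of frame forwarding: the hub floods every incoming frame to all other ports, while the switch learns which port each MAC address lives on and forwards to a single port once the address is known. The class names, port numbers and MAC strings are invented for illustration only.

```python
class Hub:
    """Floods every frame out of all ports except the one it arrived on."""
    def __init__(self, ports):
        self.ports = ports

    def forward(self, in_port, dst_mac):
        return [p for p in self.ports if p != in_port]


class Switch:
    """Learns source MAC addresses and forwards to a single known port."""
    def __init__(self, ports):
        self.ports = ports
        self.mac_table = {}                      # learned MAC address -> port

    def forward(self, in_port, src_mac, dst_mac):
        self.mac_table[src_mac] = in_port        # learn where the sender is attached
        if dst_mac in self.mac_table:
            return [self.mac_table[dst_mac]]     # known destination: one port only
        return [p for p in self.ports if p != in_port]   # unknown: fall back to flooding


sw = Switch(ports=[1, 2, 3, 4])
print(sw.forward(1, "aa:aa:aa:aa:aa:aa", "bb:bb:bb:bb:bb:bb"))  # [2, 3, 4]: not yet learned
print(sw.forward(2, "bb:bb:bb:bb:bb:bb", "aa:aa:aa:aa:aa:aa"))  # [1]: learned on port 1
```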
<urn:uuid:4cc75419-1a09-4b4a-91ec-8322c29ff70d>
CC-MAIN-2014-35
http://www.wifinotes.com/computer-networks/network-communication-devices.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500829754.11/warc/CC-MAIN-20140820021349-00396-ip-10-180-136-8.ec2.internal.warc.gz
en
0.911419
1,266
3.546875
4
Want to stay on top of all the space news? Follow @universetoday on Twitter The recent list of Universe Today’s Top 10 Stories of 2010 included the story Faster than Light Pulsars Discovered – which on further reading made it clear that the phenomenon being studied wasn’t exactly moving faster than light. Anyhow, this prompted me to look up different ways in which apparent superluminal motion might be generated, partly to reassure myself that the bottom hadn’t fallen out of relativity physics and partly to see if these things could be adequately explained in plain English. Here goes… 1) Cause and effect illusions The faster than light pulsar story is essentially about hypothetical light booms – which are a bit like a sonic booms, where it’s not the sonic boom, but the sound source, that exceeds the speed of sound – so that individual sound pulses merge to form a single shock wave moving at the speed of sound. Now, whether anything like this really happens with light from pulsars remains a point of debate, but one of the model’s proponents has demonstrated the effect in a laboratory – see this Scientific American blog post. What you do is to arrange a line of light bulbs which are independently triggered. It’s easy enough to make them fire off in sequence – first 1, then 2, then 3 etc – and you can keep reducing the time delay between each one firing until you have a situation where bulb 2 fires off after bulb 1 in less time than light would need to travel the distance between bulbs 1 and 2. It’s just a trick really – there is no causal connection between the bulbs firing – but it looks as though a sequence of actions (first 1, then 2, then 3 etc) moved faster than light across the row of bulbs. This illusion is an example of apparent superluminal motion. There are a range of possible scenarios as to why a superluminal Mexican wave of synchrotron radiation might emanate from different point sources around a rapidly rotating neutron star within an intense magnetic field. As long as the emanations from these point sources are not causally connected, this outcome does not violate relativity physics. 2) Making light faster than light You can produce an apparent superluminal motion of light itself by manipulating its wavelength. If we consider a photon as a wave packet, that wave packet can be stretched linearly so that the leading edge of the wave arrives at its destination faster, since it is pushed ahead of the remainder of the wave – meaning that it travels faster than light. However, the physical nature of ‘the leading edge of a wave packet’ is not clear. The whole wave packet is equivalent to one photon – and the leading edge of the stretched out wave packet cannot carry any significant information. Indeed, by being stretched out and attenuated, it may become indistinguishable from background noise. Also this trick requires the light to be moving through a refractive medium, not a vacuum. If you are keen on the technical details, you can make phase velocity or group velocity faster than c (the speed of light in a vacuum) – but not signal velocity. In any case, since information (or the photon as a complete unit) is not moving faster than light, relativity physics is not violated. 
3) Getting a kick out of gain media You can mimic more dramatic superluminal motion through a gain medium where the leading edge of a light pulse stimulates the emission of a new pulse at the far end of the gain medium – as though a light pulse hits one end of a Newton’s Cradle and new pulse is projected out from the other end. If you want to see a laboratory set-up, try here. Although light appears to jump the gap superluminally, in fact it’s a new light pulse emerging at the other end – and still just moving at standard light speed. 4) The relativistic jet illusion If an active galaxy, like M87, is pushing out a jet of superheated plasma moving at close to the speed of light – and the jet is roughly aligned with your line of sight from Earth – you can be fooled into thinking its contents are moving faster than light. If that jet is 5,000 light years long, it should take at least 5,000 years for anything in it to cross that distance of 5,000 light years. A photon emitted by a particle of jet material at point A near the start of the jet really will take 5,000 years to reach you. But meanwhile, the particle of jet material continues moving towards you nearly as fast as that photon. So when the particle emits another photon at point B, a point near the tip of the jet – that second photon will reach your eye in much less than 5,000 years after the first photon, from point A. This will give you the impression that the particle crossed 5,000 light years from points A to B in much less than 5,000 years. But it is just an optical illusion – relativity physics remains unsullied. 5) Unknowable superluminal motion It is entirely possible that objects beyond the horizon of the observable universe are moving away from our position faster than the speed of light – as a consequence of the universe’s cumulative expansion, which makes distant galaxies appear to move away faster than close galaxies. But since light from hypothetical objects beyond the observable horizon will never reach Earth, their existence is unknowable by direct observation from Earth – and does not represent a violation of relativity physics. And lastly, not so much unknowable as theoretical is the notion of early cosmic inflation, which also involves an expansion of space-time rather than movement within space-time – so no violation there either. I’m not sure that the above is an exhaustive list and I have deliberately left out other theoretical proposals such as quantum entanglement and the Alcubierre warp drive. Either of these, if real, would arguably violate relativity physics – so perhaps need to be considered with a higher level of skepticism.
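A small worked example may help quantify the jet illusion described in point 4. The standard expression for the apparent transverse speed of a blob moving at true speed βc at angle θ to the line of sight is β_app = β sin θ / (1 − β cos θ); the particular numbers below are illustrative and are not measurements of M87's jet.

```python
import math

# Apparent transverse speed of a relativistic jet blob:
# beta_app = beta * sin(theta) / (1 - beta * cos(theta))
beta = 0.99                      # true speed as a fraction of c (illustrative)
theta = math.radians(10.0)       # angle between the jet and our line of sight

beta_app = beta * math.sin(theta) / (1 - beta * math.cos(theta))
print(round(beta_app, 1))        # ~6.9: the blob *appears* to cross the sky at nearly 7c
```

Nothing here exceeds c; the factor in the denominator simply reflects the shrinking light-travel time as the blob chases its own photons toward us.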
<urn:uuid:2f0a599b-df83-47f7-ab3f-ad2c75418073>
CC-MAIN-2014-35
http://www.universetoday.com/81918/astronomy-without-a-telescope-apparent-superluminal-motion/
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500830834.3/warc/CC-MAIN-20140820021350-00132-ip-10-180-136-8.ec2.internal.warc.gz
en
0.942817
1,254
3.9375
4
The discovery and application by IBM researcher Stuart Parkin and his colleagues of a "spin valve"—essentially the capability to alter the magnetic state of materials at the atomic level—changed the landscape of magnetic data storage by dramatically increasing storage capacity. This helped pave the way for some of today's most popular devices and online applications. The word spintronics—short for spin electronics—was coined in the 1990s to describe devices that take advantage of "spin," a quantum-mechanical property of an electron that takes only two values: spin-up and spin-down. Spintronics research flowered following the discovery of the giant magnetoresistance (GMR) effect in the late 1980s. IBM Almaden Research Center researchers realized that GMR could be used to make more sensitive hard disk drive read heads. Parkin discovered the fundamental spintronics phenomena that made the spin valve a reality while researching novel properties of superlattices formed from combinations of various magnetic and non-magnetic materials, by flowing charge currents through these superlattices. By working at the atomic scale, he discovered that by sandwiching a non-magnetic layer of material between two magnetic layers, where each of the layers was just a few atoms thick, and by applying small magnetic fields, the current flowing through the sandwich could be changed significantly. The reason was that within the magnetic layers, the electrical current, which was composed of negatively charged electrons, became "spin-polarized": all the electrons' spins became oriented either "up" or "down," depending on the magnetic orientation of these layers—just like nanoscopic compass needles, which point to either the North or South Pole. Small magnetic fields reorient these compass needles. This effectively created the ability to turn the "spin-polarized" current on or off—just like a valve. The spin valve also created the ability to detect more minute magnetic impulses when flown over a magnetic hard drive. This ability allowed for vastly more data to be written to and stored on a hard drive than was possible before the discovery of GMR. The first use of spin-valve sensors in hard disk drive read heads came in 1997. "An I.B.M. research fellow largely unknown outside a small fraternity of physicists, Mr. Parkin puttered for two years in a lab in the early 1990s, trying to find a way to commercialize an odd magnetic effect of quantum mechanics he had observed at supercold temperatures. With the help of a research assistant, he was able to manipulate the alignment of electronics to alter the magnetic state of tiny areas of a magnetic data storage disc, making it possible to store and retrieve information in a smaller amount of space. The huge increases in digital storage made possible by giant magnetoresistance, or GMR, made consumer audio and video iPods, as well as Google-style data centers, a reality." "Redefining the Architecture of Memory," The New York Times, September 11, 2007 "The first mass-produced spintronic device has already revolutionized the hard-disk drive industry. Introduced in 1997, the giant magnetoresistive (GMR) head, developed at the IBM Almaden lab, is a super-sensitive magnetic-field sensor that enabled a 40-fold increase in data density over the past seven years.
Another multilayered spintronic structure is at the heart of the high-speed, nonvolatile magnetic random access memory (MRAM), currently being developed by a handful of companies.” “IBM, Stanford Collaborate on World-Class Spintronics Research,” PhysOrg.comApril 28, 2004 “Magnetoresistive random access memory (MRAM) is expected to revolutionize the memory market and contribute to the development of advanced and versatile computing and personal devices. Promising advances such as instantly bootable computers, MRAM could well be the next big thing in spintronics. Quantum computation is perhaps one of the most exciting potential applications of spintronics. However, harnessing the power of the quantum states to enable information processing and storage is not easy. The evolution of MRAMs and various spin-based technologies could be critically important in facilitating the development of the first quantum computer.” Spintronics—An Emerging Technology Analysis (Technical Insights), Frost & Sullivan Research ServiceMarch 28, 2005 “Think of one combined unit that integrates logic, storage, and communication for computing. We envision using a mixture of optical, electronic, and photonic techniques to prepare and manipulate spin-based information. The spin could be stored in semiconductors, run at frequencies many times faster than today’s technology and work at room temperature. And all in a single nanostructure. Then imagine millions of these nanostructures working together in a device small by human standards. What such devices will do is up to scientists and engineers to determine. But the most exciting prospects are the revolutionary ones rather than simple extrapolations of today’s technology.” “Controlling Electron Spin Electrically,” Science a GoGoDecember 28, 2001 These huge increases in storage capacity made possible the evolution of giant data centers in the “cloud.” Perhaps most importantly, the ability to store and access huge amounts of data in worldwide networks helped create the information-based world of today. In 2005 alone, the amount of data that could be stored by all the spin-valve-enabled hard drives sold equaled all of the analog data available in the world at that time—approximately 100 exabytes. Since 2007, the basic spin valve has evolved to a related thin-layered structure—magnetic tunnel junction—that displays giant tunneling magnetoresistance (TMR), a phenomenon where electrons tunnel through a thin insulator. The non-magnetic layer in a GMR spin valve has been replaced by this insulator, which, when formed from “magnesium oxide,” is a spin filter that only allows electrons of one spin direction through it, like a gatekeeper. The current that flows through magnesium oxide is composed of electrons that are almost 100 percent spin-up or spin-down, depending on the magnetic orientation of the surrounding magnetic layers. This means the TMR signal is much larger than that from a GMR spin valve: indeed it is almost 100 times larger. TMR is also the basis of magnetic random access memory (MRAM), a new type of non-volatile memory that uses magnetic moments to retain data instead of electrical charges. Stuart Parkin is now leading a team of IBM researchers in studying Racetrack Memory, a radically different non-volatile memory technology proposed by Parkin in 2004 that is based on a recently discovered spintronics phenomena. Racetrack memory uses currents of spin-oriented electrons to “move” magnetic regions along magnetic racetracks—nanoscopic magnetic wires. 
Racetrack memory is one of a number of new technologies being explored that could offer higher storage density than comparable devices such as flash memory, and eventually replace disk drives with a solid-state memory device. Throughout its history, IBM has collaborated with external entities, including universities, organizations and other corporations to advance research in a variety of technologies. In 2004, the IBM-Stanford Spintronic Science and Applications Center (SpinAps) was established in California. Within SpinAps, scientists and engineers from IBM Almaden Research Center are working together with Stanford faculty, students and post-doctoral fellows to study the theoretical and practical fundamentals of spintronics, and to develop advanced technologies built on those fundamentals. Spintronics may also enable the leap to quantum computing where units of quantum information known as “qubits” can occupy spin-up and spin-down states simultaneously, and so allow for massive increases in computational power. Selected team members who contributed to this Icon of Progress: - Dr. Stuart Parkin IBM Fellow, manager of the Magnetoelectronics group at the IBM Almaden Research Center, co-director of the IBM-Stanford Spintronic Science and Applications Center - Dr. Stuart A. Wolf Program manager at DARPA; coined the term spintronics in 1996 - Dr. James S. Harris Co-director of the IBM-Stanford Spintronic Science and Applications Center; James and Ellenor Chesebrough Professor in the Electrical Engineering Department of Stanford University - Dr. Schoucheng Zhang Co-director of the IBM-Stanford Spintronic Science and Applications Center; J. G. Jackson and C. J. Wood Professor in Physics at Stanford University - Dr. David J. Smith Regents’ Professor of Physics at Arizona State University
<urn:uuid:1edfddc6-734c-4774-859a-e4bfa7d9249e>
CC-MAIN-2014-35
http://www-03.ibm.com/ibm/history/ibm100/us/en/icons/spintronics/
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500826679.55/warc/CC-MAIN-20140820021346-00243-ip-10-180-136-8.ec2.internal.warc.gz
en
0.92181
1,826
3.890625
4
In logic circuits, the Toffoli gate (also CCNOT gate), invented by Tommaso Toffoli, is a universal reversible logic gate, which means that any reversible circuit can be constructed from Toffoli gates. It is also known as the "controlled-controlled-not" gate, which describes its action. It has 3-bit inputs and outputs; if the first two bits are set, it inverts the third bit, otherwise all bits stay the same. A logic gate L is reversible if, for any output y, there is a unique input x such that L(x) = y. If a gate L is reversible, there is an inverse gate L′ which maps y to x, i.e. L′(y) = x. Among common logic gates, NOT is reversible, as can be seen from its truth table (0 → 1, 1 → 0). The common AND gate, however, is not reversible: the inputs 00, 01 and 10 are all mapped to the output 0. Reversible gates have been studied since the 1960s. The original motivation was that reversible gates dissipate less heat (or, in principle, no heat). In a normal gate, input states are lost, since less information is present in the output than was present at the input. This loss of information dissipates energy to the surrounding area as heat, because of thermodynamic entropy. Another way to understand this is that charges on a circuit are grounded and thus flow away, taking a small quantity of energy with them when they change state. A reversible gate only moves the states around, and since no information is lost, energy is conserved. More recent motivation comes from quantum computing. Quantum mechanics requires the transformations to be reversible but allows more general states of the computation (superpositions). Thus, the reversible gates form a subset of gates allowed by quantum mechanics and, if we can compute something reversibly, we can also compute it on a quantum computer. Universality and Toffoli gate Any reversible gate must have the same number of input and output bits, by the pigeonhole principle. For one input bit, there are two possible reversible gates. One of them is NOT. The other is the identity gate, which maps its input to the output unchanged. For two input bits, the only non-trivial gate is the controlled NOT (CNOT) gate, which XORs the first bit into the second bit and leaves the first bit unchanged. Its truth table (input ab → output ab) is: 00 → 00, 01 → 01, 10 → 11, 11 → 10; written as a matrix, it is a 4 × 4 permutation matrix. Unfortunately, there are reversible functions that cannot be computed using just those gates. In other words, the set consisting of NOT and XOR gates is not universal. If we want to compute an arbitrary function using reversible gates, we need another gate. One possibility is the Toffoli gate, proposed in 1980 by Toffoli. This gate has 3-bit inputs and outputs. If the first two bits are set, it flips the third bit. Its truth table (input abc → output abc) is: 000 → 000, 001 → 001, 010 → 010, 011 → 011, 100 → 100, 101 → 101, 110 → 111, 111 → 110; written as a matrix, it is an 8 × 8 permutation matrix. It can also be described as mapping bits a, b and c to a, b and c XOR (a AND b). The Toffoli gate is universal; this means that for any Boolean function f(x1, x2, ..., xm), there is a circuit consisting of Toffoli gates which takes x1, x2, ..., xm and some extra bits set to 0 or 1 and outputs x1, x2, ..., xm, f(x1, x2, ..., xm), and some extra bits (called garbage). Essentially, this means that one can use Toffoli gates to build systems that will perform any desired Boolean function computation in a reversible manner. Related logic gates - The Fredkin gate is a reversible 3-bit gate that swaps the last two bits if the first bit is 1; a controlled-swap operation. - The n-bit Toffoli gate is a generalization of the Toffoli gate.
It takes n bits x1, x2, ..., xn as inputs and outputs n bits. The first n−1 output bits are just x1, ..., xn−1. The last output bit is (x1 AND ... AND xn−1) XOR xn. - The Toffoli gate can be realized by five two-qubit quantum gates. - This gate is one of the reversible-gate cases that can be modeled with billiard balls (see Billiard-ball computer). The billiard-ball modeling was introduced by Fredkin and Toffoli. An example of how the collisions are used to model an electronic gate is shown in the figure. Relation to quantum computing Any reversible gate can be implemented on a quantum computer, and hence the Toffoli gate is also a quantum operator. However, the Toffoli gate alone cannot be used for universal quantum computation, though it does mean that a quantum computer can implement all possible classical computations. The Toffoli gate has to be combined with single-qubit gates to be used for universal quantum computation. A quantum-mechanical Toffoli gate was successfully realized in January 2009 at the University of Innsbruck, Austria. - Technical Report MIT/LCS/TM-151 (1980) and an adapted and condensed version: Toffoli, Tommaso (1980). J. W. de Bakker and J. van Leeuwen, ed. "Reversible computing". Automata, Languages and Programming, Seventh Colloquium. Noordwijkerhout, Netherlands: Springer Verlag. pp. 632–644. doi:10.1007/3-540-10003-2_104. ISBN 3-540-10003-2. - Barenco, Adriano; Bennett, Charles H.; Cleve, Richard; DiVincenzo, David P.; Margolus, Norman; Shor, Peter; Sleator, Tycho; Smolin, John A.; Weinfurter, Harald (Nov 1995). "Elementary gates for quantum computation". Phys. Rev. A (American Physical Society) 52 (5): 3457–3467. arXiv:quant-ph/9503016. Bibcode:1995PhRvA..52.3457B. doi:10.1103/PhysRevA.52.3457. PMID 9912645. - Fredkin, Edward; Toffoli, Tommaso (April 1982). "Conservative logic". International Journal of Theoretical Physics (Springer Netherlands) 21 (3): 219–253. Bibcode:1982IJTP...21..219F. doi:10.1007/BF01857727. ISSN 0020-7748. - Monz, T.; Kim, K.; Hänsel, W.; Riebe, M.; Villar, A. S.; Schindler, P.; Chwalla, M.; Hennrich, M.; Blatt, R. (Jan 2009). "Realization of the Quantum Toffoli Gate with Trapped Ions". Phys. Rev. Lett. (American Physical Society) 102 (4): 040501. arXiv:0804.0082. Bibcode:2009PhRvL.102d0501M. doi:10.1103/PhysRevLett.102.040501.
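Since the Toffoli gate's action is just a permutation of classical bit patterns, it can be sketched in a few lines of Python to check the two properties the article emphasizes: reversibility (applying the gate twice restores the input) and universality (AND and NOT fall out by fixing inputs). This is only an illustrative classical truth-table sketch, not a quantum implementation.

```python
from itertools import product

def toffoli(a, b, c):
    """CCNOT: flip the target bit c only when both control bits a and b are 1."""
    return a, b, c ^ (a & b)

# Reversibility: the gate is its own inverse, so applying it twice restores every input.
for bits in product((0, 1), repeat=3):
    assert toffoli(*toffoli(*bits)) == bits

# Universality in miniature: AND and NOT recovered from Toffoli alone.
def AND(a, b):
    return toffoli(a, b, 0)[2]      # ancilla c = 0  ->  third output is a AND b

def NOT(a):
    return toffoli(1, 1, a)[2]      # controls fixed to 1  ->  third output is NOT a

print([(a, b, AND(a, b)) for a, b in product((0, 1), repeat=2)])   # [(0,0,0), (0,1,0), (1,0,0), (1,1,1)]
print([(a, NOT(a)) for a in (0, 1)])                               # [(0, 1), (1, 0)]
```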
<urn:uuid:def7b37d-713c-433d-80e8-22122bc42333>
CC-MAIN-2014-35
http://en.wikipedia.org/wiki/Toffoli_gate
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500811913.46/warc/CC-MAIN-20140820021331-00262-ip-10-180-136-8.ec2.internal.warc.gz
en
0.852
1,539
3.828125
4
Will we ever realize the sci-fi dream of human teleportation? Physicists have already successfully teleported tiny objects. (See Beam Me Up, Schrödinger for more on the mechanics of quantum teleportation.) What will it take to extend the technique to a living, breathing human being? Quantum teleportation is possible because of two quantum phenomena that are utterly foreign to our everyday experience: entanglement and superposition. Entanglement is the connection that links the quantum states of two particles, even when they are separated: The two particles can be described only by their joint properties. Though there is no classical analogue for entanglement, in his book Dance of the Photons Zeilinger imagined how entanglement might work if it could be applied to a pair of ordinary dice instead of a pair of subatomic particles: “The science fiction Quantum Entanglement Generator produces pairs of entangled dice. These dice do not show any number before they are observed.” In other words, they are in a superposition of states where there is an equal chance of producing any number between one and six. “When one die is observed, it randomly chooses to show a number of dots. Then, the other distant die instantly shows the same number.” This works no matter how far apart the dice are. They can be sitting beside each other or on opposite ends of the universe. In either case, when the particle over here is measured to be in one of many possible states, then we can infer the state of the particle over there, even though no energy, no mass, and no information travels between A and B when the first one is observed. The state of particle B simply is what it is. The difficult concept is that B’s state corresponds with the state of the measured particle A. Entanglement is so confounding that in the early days of quantum theory, when entanglement was supported only by thought experiments and math on paper, Einstein famously derided it as “spooky action at a distance.” Today, though, entanglement has been thoroughly tested and verified. In fact, entangling particles isn’t even the hard part: For physicists, the most difficult task is maintaining the entanglement. An unexpected particle from the surrounding environment—something as insubstantial as a photon—can jostle one of the entangled particles, changing its quantum state. These interactions must be carefully controlled or else this fragile connection will be broken. If entanglement is one gear in the quantum machinery of teleportation, the second critical gear is superposition. Remember the thought experiment about Schrödinger’s cat? A cat, a flask of poison, and a radioactive source are all placed in a sealed box. If the source decays and emits a particle, then the flask breaks and the cat dies. While the box is closed, we can’t know whether the cat is living or dead. Moreover, the cat can be considered both alive and dead until the box is opened: The cat will stay in a superposition of the two states until a “measurement is made—that is, until we look in the box and observe that the cat is either alive or dead. Schrödinger never tried this on a real cat—in fact, he drew up the thought experiment just to demonstrate the apparently preposterous implications of quantum theory, and to force theorists to examine what constitutes a “measurement”—but today scientists have demonstrated that superposition is real using systems that are increasingly large (albeit still much smaller than a cat). 
In 2010, a group of researchers at the University of California, Santa Barbara demonstrated superposition in a tiny mechanical resonator—like a tuning fork, it vibrates at a characteristic frequency, but just like the cat it doesn’t exist in a single position until measured. Last year, another group of researchers demonstrated quantum superposition in systems of as many as 430 atoms. Before superposition and entanglement appear in a human-scale teleporter, if ever, they will be harnessed for multiple applications in computing. Quantum cryptography uses entanglement to encode messages and detect eavesdropping. Because observation perturbs entanglement, eavesdropping destroys information carried by entangled particles. And if two people each receive entangled particles, they can generate an entirely secure key. Quantum cryptography is an active area of research and some systems are already on the market. Quantum mechanical superposition and entanglement could also be exploited to make faster and more powerful computers that store information in quantum states, known as “qubits,” instead of traditional electronic bits. Quantum computers could solve problems that are intractable for today’s computers. Whether it’s possible to make a working quantum computer is still in question, but roughly two dozen research groups around the world are avidly investigating methods and architectures. So we know how to teleport one particle. But what if we want to make like Captain Kirk and teleport an entire human being? Remember that we wouldn’t be moving Kirk’s molecules from one place to another. He would interact with a suite of previously-entangled particles, and when we read the quantum state we would destroy the complex quantum information that makes his molecules into him while instantly providing the information required to recreate his quantum state from other atoms in a distant location. Quantum mechanics doesn’t forbid it. The rules of quantum mechanics still apply whether you’re talking about a system of two particles or human being made of 1027 atoms. “The size doesn’t matter in and of itself,” says Andrew Cleland, a physicist at the University of California, Santa Barbara. Macroscopic systems like superconductors and Bose-Einstein condensates show quantum effects while arbitrarily large. From an engineering standpoint, though, teleporting larger objects becomes an increasingly tough problem. Cleland comments, “Taking any object and putting it in a quantum state is hard. Two is multiply hard.” Maintaining entanglement between particle requires isolating them from interactions that would break their entanglement. We don’t want Captain Kirk to end up like The Fly, so we need to keep the particles absolutely isolated. What if we start with something simpler: Instead of teleporting a person, can we teleport a much smaller living thing—like a virus? In 2009, Oriol Romero-Isart of the Max-Planck-Institut fur Quantenoptik in Germany and his colleagues proposed just such an experiment. Using current technology, it should be possible to demonstrate superposition in a virus, they argued. They didn’t try it, but laid out a procedure: First, store the virus in a vacuum to reduce interactions with the environment, and then cool it to its quantum ground state before pumping it with enough laser light to create a superposition of two different energy states. This is possible in theory because some viruses can survive cold and vacuum. But humans are hot, and that thermal energy is a problem. 
“We have quadrillions of quantum states superimposed at the same time, dynamically changing,” says Cleland. Not only are we hot, but we interact strongly with our environment: We touch the ground, we breathe. Ironically, our need to interact with our environment, our sheer physicality, could come between us and the dream of human teleportation.
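A minimal numerical sketch can illustrate the kind of correlation described above for entangled pairs: sampling measurement outcomes from the two-qubit Bell state (|00> + |11>)/√2 gives individually random results that nevertheless always agree. This toy simulation only reproduces the statistics of one fixed measurement basis; it is not a general model of entanglement or of teleportation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>)/sqrt(2) in the computational basis |00>, |01>, |10>, |11>
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probs = np.abs(bell)**2                     # Born rule: probabilities of the four outcomes

shots = rng.choice(4, size=10_000, p=probs) # sample joint measurement outcomes
outcomes = [format(s, "02b") for s in shots]
counts = {o: outcomes.count(o) for o in ("00", "01", "10", "11")}
print(counts)   # roughly half "00" and half "11", never "01" or "10":
                # each qubit's result is random, yet the two always match
```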
<urn:uuid:240b1087-5770-4d71-bbb0-7c864b8fee63>
CC-MAIN-2014-35
http://www.pbs.org/wgbh/nova/blogs/physics/2012/02/tangling-with-teleportation/
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500815050.22/warc/CC-MAIN-20140820021335-00428-ip-10-180-136-8.ec2.internal.warc.gz
en
0.922529
1,537
3.546875
4
March 17, 2013 Though the concept of the robot seems modern and relatively new, robots have been imagined for a very long time. The first possible description of a robot in literature is found in the Iliad, in a reference to "a three-legged cauldron that had ears for handles". Later, in 1900, we were introduced to Tik-Tok in Frank Baum's Wizard of Oz. The word robot was first used in 1920 by the Czech writer Karel Čapek in his play R.U.R. (Rossum's Universal Robots), the first dramatization of a robot under this name. However, robots came to life and were put to practical use in 1962, when General Motors became the first company to use a robot for industrial purposes. Since then, robots have been used in many ways and have come in all shapes and sizes. They have been used in the medical field, the armed forces, and the space program. Now, as we move through the 21st century, technology evolves further, and a new kind of robot is being studied and researched: the quantum robot. The quantum robot is the idea of combining quantum theory with robot technology; in other words, it is a practical combination of quantum computing and robotics. Quantum computing involves using quantum systems and quantum states to do computations. A robot is an automated machine that is capable of doing a set of complex tasks. In some applications, the programming used to run the robots may be based on artificial intelligence, the ability of a computer system to operate in a manner similar to human intelligence. Think of artificial intelligence as training a machine to act like a human. Essentially, quantum robots are complex quantum systems. They are mobile systems with on-board quantum computers that interact with their environments. Several programs would be involved in the operation of the robot, such as quantum searching algorithms and quantum reinforcement learning algorithms. Quantum reinforcement learning is based on the superposition of quantum states and on quantum parallelism. A quantum state is described by a set of quantum numbers; the four basic quantum numbers represent the energy level, angular momentum, spin, and magnetization. In the superposition of quantum states, the idea is to get one state to look like another. Let's say I have two dogs. One dog knows how to fetch a bone (energy level), sit up (angular momentum), give a high five (spin), and shake hands (magnetization). Now, let's apply the superposition of quantum states. Since one dog has been trained and given the commands, the other dog must learn to mimic or copy what the first dog did. Each time a command is achieved, reinforcement is given; the reinforcement for the dog would be a bone (or no bone if the command is not achieved). In quantum reinforcement learning it is slightly different: the idea is similar to an "If-Then" statement. An example would be: if the quantum state has a certain energy level, then the angular momentum has a certain value. This idea of "If-Then" statements in the quantum world leads to an idea that can be a topic of its own: quantum logic. Quantum parallelism simply means that computations can happen at the same time. This allows all of the quantum numbers of a quantum system to be measured at the same time; if there are multiple quantum systems, then, using parallelism, all of the systems can be measured at the same time.
Programs used for "quantum searching" are based on quantum random walks. Quantum random walks use probability amplitudes; a probability amplitude allows us to describe more than one possible quantum state at once. In the classical world, if you type the word "Quantum" into a search engine, you get many results. You may have a tough time finding a needle in that haystack if you use just one word, but if you refine the search to, say, "Quantum Random Walks", it narrows considerably. The same principle applies in quantum computing to get more refined results; however, you are not necessarily searching for words but for information that may correlate to a quantum state. What would be the advantages of the quantum robot over the conventional robot? Quantum robots would be more intricate in examining their environments and performing tasks, because they apply quantum effects. Given the complexity available in quantum computing, the expectation is that quantum robots would be faster, more accurate, and better at multitasking than standard robots. Quantum robots may one day be able to give us better medical diagnoses and better data interpretation in other research fields such as defense research. In medicine, they may be able to detect pathological changes in the body after being injected into the bloodstream. In the space program, they may be able to examine the delicate environments on other planets. In the military, they may be able to detect changes in magnetic and electric fields. They may also be able to help us detect early warnings of disasters more efficiently.
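The "quantum random walk" idea mentioned above can be sketched numerically. The toy simulation below runs a discrete-time Hadamard-coin walk on a line and compares its spread with the √(steps) spread of a classical random walk; this is a generic textbook walk, not the specific algorithm any quantum-robot proposal uses.

```python
import numpy as np

steps, N = 100, 201                       # 100 coin tosses on 201 lattice sites
psi = np.zeros((N, 2), dtype=complex)     # amplitude per (position, coin state)
psi[N // 2, 0] = 1 / np.sqrt(2)           # symmetric initial coin state at the centre
psi[N // 2, 1] = 1j / np.sqrt(2)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard "coin flip"

for _ in range(steps):
    psi = psi @ H.T                            # toss the coin at every position
    up, down = psi[:, 0], psi[:, 1]
    psi = np.stack([np.roll(up, 1), np.roll(down, -1)], axis=1)  # up moves right, down moves left

prob = np.sum(np.abs(psi)**2, axis=1)          # probability of finding the walker at each site
pos = np.arange(N) - N // 2
sigma_quantum = np.sqrt(np.sum(prob * pos**2) - np.sum(prob * pos)**2)

print(round(sigma_quantum, 1))      # ~54: the quantum walk's spread grows linearly with step count
print(round(np.sqrt(steps), 1))     # 10: a classical random walk only spreads as sqrt(steps)
```

The faster spreading is the property quantum-walk search algorithms exploit; amplitudes at different positions interfere, which a classical walker's probabilities cannot do.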
<urn:uuid:7f20d17f-22cf-4695-a9c9-cad49c0d0da7>
CC-MAIN-2014-35
http://blogs.scientificamerican.com/guest-blog/2013/03/17/i-quantum-robot/
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500832155.37/warc/CC-MAIN-20140820021352-00364-ip-10-180-136-8.ec2.internal.warc.gz
en
0.943011
1,088
3.578125
4
August 15, 2000 -- At a technical conference today at Stanford University, IBM-Almaden researcher Isaac Chuang described his team's experiments that demonstrated the world's most advanced quantum computer and the tremendous potential such devices have to solve problems that conventional computers cannot handle. "Quantum computing begins where Moore's Law ends -- about the year 2020, when circuit features are predicted to be the size of atoms and molecules," says Isaac L. Chuang, who led the team of scientists from IBM Research, Stanford University and the University of Calgary. "Indeed, the basic elements of quantum computers are atoms and molecules." Quantum computers get their power by taking advantage of certain quantum physics properties of atoms or nuclei that allow them to work together as quantum bits, or "qubits," to be the computer's processor and memory. By interacting with each other while being isolated from the external environment, theorists have predicted -- and this new result confirms -- that qubits could perform certain calculations exponentially faster than conventional computers. The new quantum computer contains five qubits -- five fluorine atoms within a molecule specially designed so the fluorine nuclei's "spins" can interact with each other as qubits, be programmed by radiofrequency pulses and be detected by nuclear magnetic resonance instruments similar to those commonly used in hospitals and chemistry labs. Using the molecule, Chuang's team solved in one step a mathematical problem for which conventional computers require repeated cycles. The problem is called "order-finding" -- finding the period of a particular function -- which is typical of many basic mathematical problems that underlie important applications such as cryptography. While the potential for quantum computing is huge and recent progress is encouraging, the challenges remain daunting. IBM's five-qubit quantum computer is a research instrument. Commercial quantum computers are still many years away, since they must have at least several dozen qubits before difficult real-world problems can be solved. "This result gives us a great deal of confidence in understanding how quantum computing can evolve into a future technology," Chuang says. "It reinforces the growing realization that quantum computers may someday be able to live up to their potential of solving in remarkably short times problems that are so complex that the most powerful supercomputers can't calculate the answers even if they worked on them for millions of years." Chuang says the first applications are likely to be as a co-processor for specific functions, such as database lookup and finding the solution to a difficult mathematical problem. Accelerating word processing or Web surfing would not be well-suited to a quantum computer's capabilities. Chuang presented his team's latest result today at Stanford University at the Hot Chips 2000 conference, which is organized by the Institute of Electrical and Electronics Engineers' (IEEE) Computer Society. His co-authors are Gregory Breyta and Costantino S. Yannoni of IBM-Almaden, Stanford University graduate students Lieven M.K .Vandersypen and Matthias Steffen, and theoretical computer scientist Richard Cleve of the University of Calgary. The team has also submitted a technical report of their experiment to the scientific journal, Physical Review Letters. 
History of Quantum ComputingWhen quantum computers were first proposed in the 1970s and 1980s (by theorists such as the late Richard Feynmann of California Institute of Technology, Pasadena, Calif.; Paul Benioff of Argonne National Laboratory in Illinois; David Deutsch of Oxford U. in England., and Charles Bennett of IBM's T.J. Watson Research Center, Yorktown Heights, N.Y.), many scientists doubted that they could ever be made practical. But in 1994, Peter Shor of AT&T Research described a specific quantum algorithm for factoring large numbers exponentially faster than conventional computers -- fast enough to break the security of many public-key cryptosystems. Shor's algorithm opened the doors to much more effort aimed at realizing the quantum computers' potential. Significant progress has been made by numerous research groups around the world. Chuang is currently among the world's leading quantum computing experimentalists. He also led the teams that demonstrated the world's first 2-qubit quantum computer (in 1998 at University of California Berkeley) and 3-qubit quantum computer (1999 at IBM-Almaden). The order-finding result announced today is the most complex algorithm yet to be demonstrated by a quantum computer. Note: Earlier this year, scientists at Los Alamos National Laboratories announced they had achieved quantum coherence in a seven-qubit molecule. While this is a necessary condition for achieving a quantum computer, they have not yet used the molecule as a seven-qubit quantum computer to solve a problem or to implement a quantum algorithm. How a Quantum Computer Works A quantum particle, such as an electron or atomic nucleus, can exist in two states at the same time -- say, with its spin in the up and down states. This constitutes a quantum bit, or qubit. When the spin is up, the atom can be read as a 1, and the spin down can be read as a 0. This corresponds with the digital 1s and 0s that make up the language of traditional computers. The spin of an atom up or down is the same as turning a transistor on and off, both represent data in terms of 1s and 0s. Qubits differ from traditional digital computer bits, however, because an atom or nucleus can be in a state of "superposition," representing simultaneously both 0 and 1 and everything in between. Moreover, without interference from the external environment, the spins can be "entangled" in such a way that effectively wires together a quantum computer's qubits. Two entangled atoms act in concert with each other -- when one is in the up position, the other is guaranteed to be in the down position. The combination of superposition and entanglement permit a quantum computer to have enormous power, allowing it to perform calculations in a massively parallel, non-linear manner exponentially faster than a conventional computer. For certain types of calculations -- such as complex algorithms for cryptography or searching -- a quantum computer can perform billions of calculations in a single step. So, instead of solving the problem by adding all the numbers in order, a quantum computer would add all the numbers at the same time. To input and read the data in a quantum computer, Chuang's team uses a nuclear magnetic resonance machine, which uses a giant magnet and is similar to the medical devices commonly used to image human soft tissues. 
A tiny test-tube filled with the special molecule is placed inside the machine and the scientists use radio-frequency pulses as software to alter atomic spins in the particular way that enables the nuclei to perform calculations.
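To make "order-finding" concrete, the brute-force Python sketch below computes the multiplicative order of a modulo n, the kind of period the article says the quantum computer found in a single step, and shows how an even order exposes factors of n (the link exploited by Shor's algorithm). The numbers are a standard textbook example, not the particular function used in the IBM experiment.

```python
import math

def multiplicative_order(a, n):
    """Smallest r > 0 with a**r % n == 1 (requires gcd(a, n) == 1)."""
    if math.gcd(a, n) != 1:
        raise ValueError("a and n must be coprime")
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

a, n = 7, 15
r = multiplicative_order(a, n)
print(r)                                             # 4, since 7**4 = 2401 = 1 (mod 15)
# An even order often reveals factors of n:
print(math.gcd(a**(r // 2) - 1, n), math.gcd(a**(r // 2) + 1, n))   # 3 and 5
```

A classical computer has to step through the powers of a one by one; the quantum order-finding routine evaluates the function over a superposition of exponents and reads the period out of the resulting interference pattern.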
<urn:uuid:3eb269df-9a68-466c-857f-15e6d495a582>
CC-MAIN-2014-35
http://www.sciencedaily.com/releases/2000/08/000817081121.htm
s3://commoncrawl/crawl-data/CC-MAIN-2014-35/segments/1408500813887.15/warc/CC-MAIN-20140820021333-00386-ip-10-180-136-8.ec2.internal.warc.gz
en
0.934439
1,373
3.65625
4
A Chinese satellite has split pairs of "entangled photons" and transmitted them to separate ground stations 745 miles (1,200 kilometers) apart, smashing the previous distance record for such a feat and opening new possibilities in quantum communication. In quantum physics, when particles interact with each other in certain ways they become "entangled." This essentially means they remain connected even when separated by large distances, so that an action performed on one affects the other. In a new study published online today (June 15) in the journal Science, researchers report the successful distribution of entangled photon pairs to two locations on Earth separated by 747.5 miles (1,203 km). [The 18 Biggest Unsolved Mysteries in Physics] Quantum entanglement has interesting applications for testing the fundamental laws of physics, but also for creating exceptionally secure communication systems, scientists have said. That's because quantum mechanics states that measuring a quantum system inevitably disturbs it, so any attempt to eavesdrop is impossible to hide. But, it's hard to distribute entangled particles — normally photons — over large distances. When traveling through air or over fiber-optic cables, the environment interferes with the particles, so with greater distances, the signal decays and becomes too weak to be useful. In 2003, Pan Jianwei, a professor of quantum physics at the University of Science and Technology of China, started work on a satellite-based system designed to beam entangled photon pairs down to ground stations. The idea was that because most of the particle's journey would be through the vacuum of space, this system would introduce considerably less environmental interference. "Many people then thought it [was] a crazy idea, because it was very challenging already doing the sophisticated quantum-optics experiments inside a well-shielded optical table," Pan told Live Science. "So how can you do similar experiments at thousand-kilometers distance scale and with the optical elements vibrating and moving at a speed of 8 kilometers per second [5 miles per second]?" In the new study, researchers used China's Micius satellite, which was launched last year, to transmit the entangled photon pairs. The satellite features an ultrabright entangled photon source and a high-precision acquiring, pointing and tracking (APT) system that uses beacon lasers on the satellite and at three ground stations to line up the transmitter and receivers. Once the photons reached the ground stations, the scientists carried out tests and confirmed that the particles were still entangled despite having traveled between 994 miles and 1,490 miles (1,600 and 2,400 km), depending on what stage of its orbit the satellite was positioned at. Only the lowest 6 miles (10 km) of Earth's atmosphere are thick enough to cause significant interference with the photons, the scientists said. This means the overall efficiency of their link was vastly higher than previous methods for distributing entangled photons via fiber-optic cables, according to the scientists. [Twisted Physics: 7 Mind-Blowing Findings] "We have already achieved a two-photon entanglement distribution efficiency a trillion times more efficient than using the best telecommunication fibers," Pan said. "We have done something that was absolutely impossible without the satellite." 
Apart from carrying out experiments, one of the potential uses for this kind of system is for "quantum key distribution," in which quantum communication systems are used to share an encryption key between two parties that is impossible to intercept without alerting the users. When combined with the correct encryption algorithm, this system is uncrackable even if encrypted messages are sent over normal communication channels, experts have said. Artur Ekert, a professor of quantum physics at the University of Oxford in the United Kingdom, was the first to describe how entangled photons could be used to transmit an encryption key. "The Chinese experiment is quite a remarkable technological achievement," Ekert told Live Science. "When I proposed the entangled-based quantum key distribution back in 1991 when I was a student in Oxford, I did not expect it to be elevated to such heights!" The current satellite is not quite ready for use in practical quantum communication systems, though, according to Pan. For one, its relatively low orbit means each ground station has coverage for only about 5 minutes each day, and the wavelength of photons used means it can only operate at night, he said. Boosting coverage times and areas will mean launching new satellites with higher orbits, Pan said, but this will require bigger telescopes, more precise tracking and higher link efficiency. Daytime operation will require the use of photons in the telecommunications wavelengths, he added. But while developing future quantum communication networks will require considerable work, Thomas Jennewein, an associate professor at the University of Waterloo's Institute for Quantum Computing in Canada, said Pan's group has demonstrated one of the key building blocks. "I have worked in this line of research since 2000 and researched on similar implementations of quantum-entanglement experiments from space, and I can therefore very much attest to the boldness, dedication and skills that this Chinese group has shown," he told Live Science.
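The signal decay in fiber mentioned above can be made concrete with a back-of-the-envelope loss calculation. The 0.2 dB/km figure below is a typical value for telecom fiber and is an assumption for illustration, not a number from the study:

```python
# Photon survival probability in optical fiber with a typical loss of ~0.2 dB/km.
loss_db_per_km = 0.2
for km in (100, 500, 1200):
    surviving_fraction = 10 ** (-loss_db_per_km * km / 10)
    print(f"{km:>5} km of fiber: {surviving_fraction:.1e} of photons survive")
# At 1,200 km the fraction is ~1e-24, i.e. essentially no photons arrive --
# which is why sending them through the near-vacuum of space is so attractive.
```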
<urn:uuid:ef2a4213-8fcc-4974-83ab-8a8a18988667>
CC-MAIN-2022-05
https://www.livescience.com/59502-new-quantum-entanglement-record.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320303747.41/warc/CC-MAIN-20220122043216-20220122073216-00277.warc.gz
en
0.942431
1,047
3.65625
4
Quantum dice debut
Technology Research News

Researchers have overcome a major obstacle to generating random numbers on quantum computers by limiting the possibilities in the otherwise unlimited randomness of a set of quantum particles.

Random numbers play a key role in classical computing by providing an element of chance in games and simulations, a reliable method for encrypting messages, and a means of accurately sampling huge amounts of data. Researchers from the Massachusetts Institute of Technology and the National Atomic Energy Commission in Argentina have shown that short sequences of random operations -- randomly shifting laser pulses or magnetic fields -- acting on a string of quantum bits can, in effect, generate random configurations of qubits.

Being able to generate random numbers in quantum computing could make quantum computers easier to build by countering the noise that eventually destroys qubits, which represent the 1s and 0s of computer information. Quantum computers promise to be fantastically fast at solving certain types of large problems, including the mathematics that underpins today's encryption systems. Quantum random numbers could also be useful for increasing the efficiency of quantum secret-sharing schemes, quantum encryption and various forms of quantum communications.

Qubits can represent not only 1 and 0 but any number in between; a string of 100 qubits can represent every possible 100-digit binary number, and a single set of operations can search every possible answer to a problem at once. This gives quantum computers their power, but also poses a problem for generating random numbers. The nearly infinite number of possible qubit configurations theoretically requires an impossibly large number of operations.

In the quantum world, no outcome is certain, and in most aspects of quantum computing, the goal is to reduce the uncertainty in order to get a definite answer to a problem. The researchers' scheme, however, aims for uncertainty. It limits the possible outcomes without making them predictable. The scheme generates quantum states in such a way that the probabilities of the limited set of outcomes are as evenly distributed over the nearly infinite range of possible outcomes as quantum theory allows, said Joseph Emerson, one of the MIT researchers, who is now a fellow at the Perimeter Institute for Theoretical Physics in Canada. "These pseudo-random transformations are a practical substitute for truly... random transformations," he said.

The number of operations required to represent a truly random configuration increases exponentially with the number of qubits in the configuration. For example, if the quantum equivalent of generating random numbers takes 2^2, or four, operations for two qubits, 15 qubits would require 2^15, or 32,768, operations.

The researchers' pseudo-random number method could be used to help build quantum computers by providing a practical way to estimate imperfections or errors in quantum processors, said Emerson. "This is addressing a very big problem -- imperfections such as decoherence and inadequate control of the coherence between the qubits are the main limiting factors in the creation of large-scale quantum computers," he said. A quantum particle decoheres, or is knocked out of its quantum state, when it interacts with energy from the environment in the form of light, heat, electricity or magnetism. Researchers are looking for ways to fend off decoherence for as long as possible in order to make qubits last long enough to be useful.
A way to estimate decoherence would allow researchers to assess the strength and type of environmental noise limiting the precision of a given quantum device, said Emerson. Random quantum operations can be used as control operations that, when subjected to the noise affecting a prototype quantum computer, will generate a response that depends only on the noise, he said. This way the noise can be characterized with many fewer measurements than existing methods, which depend on the interactions of the qubits and so require a number of measurements that increases exponentially with the number of qubits, he said.

In addition to helping build quantum computers, random operators would be useful for quantum communications tasks like encryption, said Emerson. "The idea is to randomize a specific configuration of qubits containing the message, and then transmit this randomized state," he said. In this case, if each bit that makes up the message is encrypted, or changed randomly, it is not possible for an eavesdropper to find any type of pattern that may lead to cracking the message.

The researchers tested the method on a three-qubit prototype liquid nuclear magnetic resonance (NMR) quantum computer. The computer consists of a liquid sample containing the amino acid alanine, a molecule whose three carbon atoms are the carbon-13 isotope. The qubits are these atoms' nuclear spins, which are analogous to a top spinning clockwise or counterclockwise. The two directions, spin up and spin down, can be used to represent 1 and 0. The qubits are controlled by magnetic fields generated by the nuclear magnetic resonance machine.

Being able to diagnose faulty quantum computer components in a way that is independent of the number of qubits is very important, said Daniel Lidar, an assistant professor of theoretical chemical physics at the University of Toronto. "For this reason alone I suspect random [operators] will find widespread applications as quantum computer benchmarking becomes an experimental reality," he said. It is also likely that future quantum algorithms will make increasing use of pseudo-random operators, said Lidar.

The researchers are working on making the random-number-generation system more precise, said Emerson. "Right now one can only estimate very coarse properties of the noise, such as [its] overall strength," he said. "I would like to devise methods to get a much more detailed analysis of the noise operators." Complete noise-estimation experiments could be implemented in rudimentary quantum computers within the next few years, said Emerson. Researchers generally agree that practical quantum computers are a decade or two away.

Emerson's research colleagues were Yaakov S. Weinstein, Marcos Saraceno, Seth Lloyd, and David G. Cory. The work appeared in the December 19, 2003 issue of Science. The research was funded by the National Science Foundation (NSF), the Defense Advanced Research Projects Agency (DARPA) and the Cambridge-MIT Institute.

Timeline: 2 years, 10-20 years
Funding: Government; University
TRN Categories: Quantum Computing and Communications; Physics
Story Type: News
Related Elements: Technical paper, "Pseudo-Random Unitary Operators for Quantum Information Processing," Science, December 19, 2003
January 14/21, 2004
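To see why short random sequences are attractive, compare the cost of sampling a truly (Haar-)random operation with the exponential size of the state space. The sketch below is a generic NumPy illustration of the scaling argument in the article, not the authors' actual pseudo-random construction:

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_random_unitary(n_qubits):
    """Sample a truly random (Haar-distributed) unitary on n qubits.
    The matrix is 2^n x 2^n, so the cost explodes as qubits are added."""
    dim = 2 ** n_qubits
    z = (rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    phases = np.diag(r) / np.abs(np.diag(r))
    return q * phases  # rescale each column's phase so the result is Haar-distributed

print(haar_random_unitary(2).shape)  # (4, 4): tiny for 2 qubits

for n in (2, 15):
    print(n, "qubits ->", 2 ** n, "basis states")
# 2 qubits -> 4 basis states and 15 qubits -> 32,768, the 2^2 and 2^15 figures
# quoted above; a pseudo-random scheme aims to get by with far fewer operations.
```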
<urn:uuid:869e04c9-98ce-4152-94fd-94784596cea5>
CC-MAIN-2022-05
http://www.trnmag.com/Stories/2004/011404/Quantum_dice_debut_011404.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320301475.82/warc/CC-MAIN-20220119155216-20220119185216-00677.warc.gz
en
0.884067
1,595
3.859375
4
Physicists at the National Institute of Standards and Technology (NIST) have harnessed the phenomenon of “quantum squeezing” to amplify and measure trillionths-of-a-meter motions of a lone trapped magnesium ion (electrically charged atom). Described in the June 21 issue of Science, NIST’s rapid, reversible squeezing method could enhance sensing of extremely weak electric fields in surface science applications, for example, or detect absorption of very slight amounts of light in devices such as atomic clocks. The technique could also speed up operations in a quantum computer. “By using squeezing, we can measure with greater sensitivity than could be achieved without quantum effects,” lead author Shaun Burd said. “We demonstrate one of the highest levels of quantum squeezing ever reported and use it to amplify small mechanical motions,” NIST physicist Daniel Slichter said. “We are 7.3 times more sensitive to these motions than would be possible without the use of this technique.” Although squeezing an orange might make a juicy mess, quantum squeezing is a very precise process, which moves measurement uncertainty from one place to another. Imagine you are holding a long balloon, and the air inside it represents uncertainty. Quantum squeezing is like pinching the balloon on one end to push air into the other end. You move uncertainty from a place where you want more precise measurements, to another place, where you can live with less precision, while keeping the total uncertainty of the system the same. In the case of the magnesium ion, measurements of its motion are normally limited by so-called quantum fluctuations in the ion’s position and momentum, which occur all the time, even when the ion has the lowest possible energy. Squeezing manipulates these fluctuations, for example by pushing uncertainty from the position to the momentum when improved position sensitivity is desired. In NIST’s method, a single ion is held in space 30 micrometers (millionths of a meter) above a flat sapphire chip covered with gold electrodes used to trap and control the ion. Laser and microwave pulses are applied to calm the ion’s electrons and motion to their lowest-energy states. The motion is then squeezed by wiggling the voltage on certain electrodes at twice the natural frequency of the ion’s back-and-forth motion. This process lasts only a few microseconds. After the squeezing, a small, oscillating electric field “test signal” is applied to the ion to make it move a little bit in three-dimensional space. To be amplified, this extra motion needs to be “in sync” with the squeezing. Finally, the squeezing step is repeated, but now with the electrode voltages exactly out of sync with the original squeezing voltages. This out-of-sync squeezing reverses the initial squeezing; however, at the same time it amplifies the small motion caused by the test signal. When this step is complete, the uncertainty in the ion motion is back to its original value, but the back-and-forth motion of the ion is larger than if the test signal had been applied without any of the squeezing steps. To obtain the results, an oscillating magnetic field is applied to map or encode the ion’s motion onto its electronic “spin” state, which is then measured by shining a laser on the ion and observing whether it fluoresces. Using a test signal allows the NIST researchers to measure how much amplification their technique provides.
In a real sensing application, the test signal would be replaced by the actual signal to be amplified and measured. The NIST method can amplify and quickly measure ion motions of just 50 picometers (trillionths of a meter), which is about one-tenth the size of the smallest atom (hydrogen) and about one-hundredth the size of the unsqueezed quantum fluctuations. Even smaller motions can be measured by repeating the experiment more times and averaging the results. The squeezing-based amplification technique allows motions of a given size to be sensed with 53 times fewer measurements than would otherwise be needed. Squeezing has previously been achieved in a variety of physical systems, including ions, but the NIST result represents one of the largest squeezing-based sensing enhancements ever reported. NIST’s new squeezing method can boost measurement sensitivity in quantum sensors and could be used to more rapidly create entanglement, which links properties of quantum particles, thus speeding up quantum simulation and quantum computing operations. The methods might also be used to generate exotic motional states. The amplification method is applicable to many other vibrating mechanical objects and other charged particles such as electrons. This work was supported in part by the Army Research Office and the Office of Naval Research. Paper: S.C. Burd, R. Srinivas, J.J. Bollinger, A.C. Wilson, D.J. Wineland, D. Leibfried, D.H. Slichter and D.T.C. Allcock. Quantum amplification of mechanical oscillator motion. Science. Published online June 20, 2019. DOI: 10.1126/science.aaw2884
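The numbers above fit a simple picture: squeezing trades position uncertainty for momentum uncertainty while their product stays fixed, and a G-fold gain in single-shot sensitivity is worth roughly G-squared fewer averaged measurements. The sketch below uses dimensionless units, and linking the 53x figure to the square of the 7.3x gain is an illustrative assumption, not a result stated in the paper:

```python
import numpy as np

# Zero-point uncertainties in dimensionless units: Heisenberg requires dx * dp >= 1/2,
# and the unsqueezed ground state saturates the bound with dx = dp = 1/sqrt(2).
dx0 = dp0 = np.sqrt(0.5)

gain = 7.3                     # reported sensitivity enhancement
dx_sq = dx0 / gain             # position fluctuations pinched down ...
dp_sq = dp0 * gain             # ... momentum fluctuations inflated by the same factor

print(dx0 * dp0, dx_sq * dp_sq)  # both 0.5: total uncertainty is only moved, not reduced

# Averaging N repetitions improves precision by sqrt(N), so a 7.3x single-shot gain
# corresponds to roughly 7.3^2 ~ 53 times fewer measurements for the same precision.
print(gain ** 2)                 # ~53.3
```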
<urn:uuid:ca27ec28-7feb-4fb2-85d4-0f5ea697798f>
CC-MAIN-2022-05
https://www.nist.gov/news-events/news/2019/06/nist-team-supersizes-quantum-squeezing-measure-ultrasmall-motion
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320303864.86/warc/CC-MAIN-20220122134127-20220122164127-00118.warc.gz
en
0.920474
1,081
4.03125
4
A quantum network combines quantum computing with quantum cryptography. It relies on a quantum key distribution algorithm, which allows secure communication to be carried out over the network. A quantum network is a system for transporting quantum information between physically separated quantum systems. The nodes in distributed quantum computing networks process information using quantum logic gates. Free-space links and optical quantum networks play an important role when quantum states must be transmitted, in the form of photons, across large distances. Many quantum networks are used for quantum key distribution between classical computing environments. This process facilitates the sharing of secret encryption keys between two parties. In April 2012 at the Max Planck Institute of Quantum Optics in Germany, Gerhard Rempe and fellow researchers announced their first working quantum network to the world. In that network, transmission is performed by photons travelling along a link between highly sensitive atoms. In a fibre-optic cable, transmission happens through tiny glass fibres via light emission. A quantum signal can also be carried over free-space connections through light emission, without any glass fibre. Quantum data differs from binary data: binary data takes the form of 0s and 1s, whereas quantum data can be both at once or neither. These physical properties add a new dimension to the computer system, and through the quantum process you can transmit a simple piece of data securely. Quantum cryptography is a method that gives information a high level of security. In this method, if the value of a photon changes, the receiving party automatically becomes aware of the attack. The main aim of the process is to protect information from hackers. A central principle of quantum physics is quantum entanglement: the term for multiple particles being linked together in such a way that the quantum state of one particle determines the possible quantum states of the others. This connection does not depend on the particles' location in space. Entanglement correlations appear effectively instantaneously, although this does not violate the classical speed-of-light limit. The approach is most often proposed for deep-space communication and cryptography. For example, NASA's Lunar Atmosphere and Dust Environment Explorer demonstrated links that could be used to download and upload information between a spacecraft and a ground-based receiver. How does the current system work? In the current system, if you want to send information safely, you encrypt the message. To read an encrypted message the receiver needs the key, so you also have to send that key. The problem with this system is that someone can intercept the key during transmission and use it to decrypt your message; because the intended receiver also receives the key, you never realize that your message has been compromised. How the new quantum system works: through a normal communication channel, China will send an encrypted message from one location to another. The quantum key for decrypting the message will be passed, in the form of a set of photons in specific quantum states, to the satellite. The satellite will then pass that key to the particular message recipient.
Breaking this process is much more difficult for a hacker because it relies on the quantum-mechanical properties of photons. If anything alters the quantum key, the exchange simply fails: the would-be snooper ends up with broken keys, the receiver sees the disturbance, and both parties know that someone has tampered with the transmission. World's first unhackable quantum messaging: China will launch the world's first "unhackable" quantum messaging and file-sharing service. The highly secure quantum communication system has been in use since August in the city of Jinan, China. The system allows around 200 users from government, the military, and finance to communicate with high security across the network. The link will be the longest and most secure quantum communication line in the world, travelling 2,000 km from Beijing to Shanghai through a message hub in Jinan, and it is capable of encrypting 4,000 pieces of data per second. The network cost 120 million yuan to build and has gone through some 50 tests since May. For transmitting a message, the network uses quantum key distribution, making it more secure than telephone cables and the current internet. Development started in 2013, and last year China completed the 2,000 km quantum link connecting Beijing, Jinan, Hefei, and Shanghai. When exchanging files, faxes, and secure telephone communication, it transfers data with a 99% success rate. Why is a quantum network unhackable? Because it transfers information using light particles and rests on the quantum phenomenon of entanglement to distribute keys. Entanglement is key to the working of quantum computers; it is what would connect them into a network, and it supports highly sophisticated cryptography so that information can be exchanged securely. A quantum entanglement distance record of 97 km was set in 2012 across Qinghai Lake in China by quantum physicist Jian-Wei Pan of the University of Science and Technology of China. In a quantum network, messages are sent embedded in particles of light. If a third party attempts to hack the system, the light particles are disturbed because of their quantum nature, the communication stops, and the authorities are alerted; the contents of the message remain impossible to read or interpret. In this article, you have seen what a quantum network is and how such a system works, how the current approach to securing information works, and how China's new "unhackable" system, built on a quantum network, is intended to provide a far higher level of security.
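The tamper-evidence described above comes from quantum key distribution. Below is a toy classical simulation of BB84-style key sifting with an intercept-and-resend eavesdropper, an illustration of the principle rather than the protocol actually deployed on the Beijing-Shanghai link:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000                                   # photons Alice sends

alice_bits  = rng.integers(0, 2, n)        # raw key material
alice_bases = rng.integers(0, 2, n)        # 0 = rectilinear, 1 = diagonal

def measure(bits, prep_bases, meas_bases):
    """Measuring in the wrong basis yields a random result -- the disturbance QKD relies on."""
    random_results = rng.integers(0, 2, len(bits))
    return np.where(prep_bases == meas_bases, bits, random_results)

# Eve intercepts, measures in her own random bases, and re-sends what she saw.
eve_bases = rng.integers(0, 2, n)
eve_bits  = measure(alice_bits, alice_bases, eve_bases)

# Bob measures in his own random bases.
bob_bases = rng.integers(0, 2, n)
bob_bits  = measure(eve_bits, eve_bases, bob_bases)

# Sifting: keep only rounds where Alice's and Bob's bases happened to match.
keep = alice_bases == bob_bases
error_rate = np.mean(alice_bits[keep] != bob_bits[keep])
print(f"error rate on sifted key with Eve present: {error_rate:.2%}")  # ~25%, so Eve is exposed
```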
<urn:uuid:88560b09-d56f-4e94-b23d-55a186ae1c23>
CC-MAIN-2022-05
https://www.freelancinggig.com/blog/2017/08/01/china-build-longest-unhackable-quantum-network-messaging-system/
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320304528.78/warc/CC-MAIN-20220124094120-20220124124120-00038.warc.gz
en
0.907969
1,198
3.78125
4
Mankind Will Soon Be Able to Travel to Other Galaxies – Spaceships Faster than Lightspeed!

In millennia past, people have progressively figured out more effective ways of getting from one place to another. Previously, long distances could only be traversed on horseback or even on foot, but today we have at our disposal a variety of modes of transportation, including cars, planes, trains, and even futuristic ships. So, while we've already made progress in developing faster and more efficient forms of transportation on Earth, the question now is how manned space flight will evolve in this regard. As is well known, a space probe launched from Earth takes many months, if not years, to reach its distant target. It's hard to say what the future holds for this venture. A fascinating question arises when we follow the thought experiment of breaking ever new cosmic speed records to its logical conclusion: will there ever be light-speed ships? In today's video, we will take a closer look at this fascinating issue with you. Come on, let's get started!

Light is extraordinarily fast, and there is no limit to the distance it may travel. According to the laws of physics, nothing can move faster than light; it is the fastest thing in the universe, travelling at a speed of 186,000 miles per second. In the blink of an eye, light may travel from Los Angeles to New York City, roughly a million times faster than any commercial airliner. Proxima Centauri is the nearest star to Earth apart from the Sun. It's 4.25 light-years away, or 25 trillion miles (40 trillion km). The Parker Solar Probe, which is already in orbit, will achieve a top speed of 450,000 mph, making it the fastest spacecraft ever. At such a speed, it would take just 20 seconds to travel from Los Angeles to New York City, yet the solar probe would take 6,633 years to reach Earth's nearest neighboring star system.

Everything in our Universe is bound by a few simple rules. The conservation of energy, momentum, and angular momentum is guaranteed anytime two quanta come into contact with each other. The physics of a system of particles is unchanged when the system is mirrored, its particles are swapped for antiparticles, and time is reversed, all at once. Nothing can ever travel faster than the speed of light, and nothing with mass can even reach that coveted speed. Many people have come up with innovative ways to get around this final restriction. Tachyons have been introduced as hypothetical particles that could exceed the speed of light, but tachyons must have imaginary masses and do not exist in the real world. Although a sufficiently twisted space in General Relativity could produce other, shorter paths for light, there are no known wormholes in our physical universe. And while quantum entanglement can produce "spooky" behavior at a distance, no information can be transported faster than light.

People will have to go faster than the speed of light if they ever hope to travel effortlessly between stars. However, until now, faster-than-light travel has only existed in science fiction. In Isaac Asimov's Foundation series, humans use jump drives to travel between planets, stars, and even across the entire cosmos. The heroes of Interstellar and Thor exploit wormholes to travel between solar systems in just a few seconds.
Warp drive technology is another option that "Star Trek" fans are familiar with. Theoretically, warp drives are conceivable, but they are a long way off. One of the many obstacles separating warp drive theory from reality was reported to have been surmounted in two studies published in March 2021. But how would these speculative warp drives actually operate? And will humankind ever be able to travel at warp speed? Albert Einstein's theory of General Relativity is the foundation of modern physics' understanding of spacetime. According to General Relativity, nothing can travel faster than the speed of light in the universe. Mass and energy can also cause spacetime to distort around massive objects, such as stars and black holes. Many space heroes are afraid of "falling into" or "getting stuck in" a gravity well because of this curvature. John Campbell and Isaac Asimov, among the first science fiction writers, regarded warping as a technique to get around the speed limit. Wouldn't it be something if a spaceship could shrink the space in front of it while simultaneously expanding the space behind it? The warp drive from "Star Trek" is based on this concept. Mexican theoretical physicist Miguel Alcubierre demonstrated in 1994 that compressing spacetime in front of the spacecraft while expanding it behind was mathematically achievable under the laws of General Relativity.
<urn:uuid:60ca2f31-c530-4c18-89ff-e1db70d13216>
CC-MAIN-2022-05
https://kiviac.com/2022/01/14/scientists-found-a-new-way-to-finally-travel-faster-than-light/
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320304872.21/warc/CC-MAIN-20220125190255-20220125220255-00118.warc.gz
en
0.934534
1,137
3.703125
4
In 2016 China launched “QUESS” (Quantum Experiments at Space Scale), a new type of satellite that it hopes will be capable of “quantum communications”, which is supposed to be hack-proof, through the use of “quantum entanglement”. This allows the operator to ensure that no one else is listening to the communications: keys are distributed reliably and then used for encryption, so the parties can be certain that no one in the middle is intercepting that information. According to the Chinese scientists involved in the project, quantum encryption is secure against any kind of computing power because information encoded in a quantum particle is destroyed as soon as it is measured. According to Tibor Molnar, a scientist at the University of Sydney, the only way to ‘observe’ a photon is to have it interact with (a) an electron, or (b) an electromagnetic field. Either of these interactions will cause the photon to “decohere” – i.e., interfere with it in a way that will be apparent to the intended recipient. Grégoire Ribordy, co-founder of Geneva-based quantum cryptography firm ID Quantique, likened it to sending a message written on a soap bubble. “If someone tries to intercept it when it’s being transmitted, by touching it, they make it burst.” Quantum physicists have recently advanced the use of photons to communicate securely over short distances – 50-150 km – on Earth. The satellite, if successful, would vastly expand the range of unhackable communication. To test whether quantum communications can take place at a global scale, the Chinese team will attempt to beam a quantum cryptographic key through space from Beijing to Vienna. This topic was also discussed by a group of my international colleagues (USA, UK, Netherlands), and this is a summary of that discussion. Two of them assisted in explaining what this is all about: one worked on the first quantum key distribution network, and the other works close to one of the world’s best quantum computing teams. The two explained the differences between the various quantum technologies.
- Quantum communications – sending information encoded in single photons (or equivalent) such that one can detect eavesdropping. Most useful for key exchange, though it has other uses. Sometimes called quantum key distribution networks.
- Quantum cryptography – work to devise cryptographic algorithms that are not affected by the creation of quantum computers. (Generally “quantum cryptography” has tended to mean what is now called – in the context of the Chinese satellite – “quantum communications”). Post-quantum cryptography is the search for algorithms not rendered useless by quantum computation.
- Quantum computing – a computer that harnesses quantum physics such that certain types of computation can be done more efficiently. There are still some doubts as to whether this is feasible. (Less so than before, but some say that it might be like nuclear fusion: not forbidden by physical laws, but hard to implement.) And, like fusion, if it could be made practical, certain types of cryptosystems (in particular, the RSA cryptosystem, but also the elliptic curve systems that have become widespread) would have to be abandoned. RSA encryption relies on the practical difficulty of factorising very large numbers, a task which is imagined to be very much easier (or at least faster) with quantum computers. But we do have substitute classical crypto systems that could be used that, as far as we know, are hard to break.
A few other colleagues discussed the concept of “quantum entanglement”. As one of them explained, intuitively you’d think this would provide a means of faster-than-light communication. However, it turns out that even though the two particles are quantum entangled, you can’t actually convey any information between the two measurement points. Tibor added to this that even Quantum Key Distribution requires two-channel communication: one of “entangled photons” (which may be described as super-luminal), and another classical channel (which is definitely sub-luminal) advising which measurements of those photons are significant. To take an example, if you measure two quantum entangled photons and find the first photon is “spin up”, the second photon will always be “spin down” and vice-versa. Some clever statistics – the so-called “Bell Inequality” and its further elaboration, the “CHSH Inequality” – tell you the photons weren’t in these states to start with: it’s only the act of measuring that forces the first photon into its state, and then instantly the second photon will be in the opposite state. Or so it seems: there are other interpretations, e.g., Quantum Bayesianism, but the effect is the same. I won’t go into details here; the explanation of how we know they weren’t in a particular state to start with is fairly long and difficult to get your head around. The mathematics (and in this example, intuition) also tell you that no information is conveyed from one location to the other by the measurement alone. The discussion also addressed the implications of this development. One of the experts commented: “This has zero practical significance”. Classical crypto is occasionally attacked, but the progress against the basic mathematical algorithms is seldom dramatic. Tibor added that it will become much more significant/dramatic when/if quantum computing becomes a reality, for then the most commonly used ‘classical’ cryptography techniques will no longer be secure. Practically all of the zillions of attacks that we hear about are at higher levels, implementation, protocols, … and, of course, human users (phishing, whaling). So the question could indeed be: why struggle to intercept/decrypt a message when you can just read the Post-It Note stuck on the sender’s screen? Paul Budde (standing on the shoulders of giants)
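The CHSH inequality mentioned above can be checked numerically. For a maximally entangled pair, the textbook correlation between spin measurements at angles a and b is E(a, b) = -cos(a - b); with the standard choice of angles the CHSH combination reaches 2√2, above the classical limit of 2. This is a generic textbook illustration under those assumptions, not data from QUESS:

```python
import numpy as np

def E(a, b):
    """Correlation of spin measurements at angles a and b on a singlet (entangled) pair."""
    return -np.cos(a - b)

# Standard CHSH measurement angles (radians) for Alice (a, a') and Bob (b, b')
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(abs(S))   # ~2.828 = 2*sqrt(2); any local "pre-set" states are limited to |S| <= 2
```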
<urn:uuid:f0de229e-aa97-47fd-9cf9-4189dded6656>
CC-MAIN-2022-05
https://paulbudde.com/blog/telecommunications/quess-quantum-communications/
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320304883.8/warc/CC-MAIN-20220129092458-20220129122458-00679.warc.gz
en
0.9426
1,250
3.625
4
Before the advent of quantum physics, Albert Einstein, still thinking in the classical paradigm, thought that nothing in the universe could travel faster than light. In the past two decades, however, it has been experimentally proven that one thing can indeed move faster than the speed of light: information. Information can be sent between two objects at any distance instantaneously. This ground-breaking experiment conclusively proved the existence of “Quantum Entanglement” which is basically a fancy name for “instantaneous information travel.” First, scientists took single photons and split them into separate “twin” particles with identical properties. Then they fired both particles away from each other in opposite directions through specially designed fiber-optic chambers. At the end of these long pathways, the twin particles were forced to choose between two random but exactly identical routes. Curiously, without fail, in every trial the particles made precisely the same choices and traveled the same paths. Classical physics has always assumed that separate particles have no communication with one another, but quantum physics has now proven that assumption erroneous. The first entanglement experiments were designed and tested in 1982 by French physicist Alain Aspect at Orsay’s Institut d’Optique. These crude but conclusive studies later inspired Nicolas Gisin’s University of Geneva group of physicists to replicate them at greater distances. In 1997 Gisin built a 14-mile fiber-optic chamber and repeated Aspect’s experiment with exactly the same results. Later in 2004 Gisin extended the chamber to 25 miles and once again, as usual, no matter how far apart, the particles always chose and traveled the same random pathways. “Quantum mechanics has shown through experimentation that particles, being after all but moving points on some infinite wave, are in communication with one another at all times. That is to say, if our quantum mechanic does something to particle A over in Cincinnati, Ohio, planet Earth, the experience of this event will be instantly communicated to particle Z, at speeds faster than light, over in Zeta Reticuli. What this suggests is that anything one given particle experiences can be experienced by another particle simultaneously, and perhaps even by all particles everywhere. The reason for this is that they are all part of the same wave, the same energy flow.” –Jake Horsley, “Matrix Warrior” (90-91) “For a message to travel between them, it would have to be moving faster than the speed of light. But according to Einstein’s theory of relativity, nothing can travel that quickly. So is it possible that these particles are violating the laws of physics … or are they demonstrating something else to us? Could they be showing us something so foreign to the way we think about our world that we’re still trying to force the mystery of what we see into the comfortable familiarity of how we believe energy gets from one place to another? What if the signal from one photon never traveled to reach the other? Is it possible that we live in a universe where the information between photons, the prayer for our loved ones, or the desire for peace in a place halfway around the world never needs to be transported anywhere to be received? The answer is yes!
This appears to be precisely the kind of universe we live in.” -Gregg Braden, “The Divine Matrix” (105-6) Robert Nadeau, a historian of science, and Menas Kafatos, a physicist from George Mason University, wrote an entire book together on the results and implications of quantum entanglement and non-locality entitled The Nonlocal Universe. In it they state, “All particles in the history of the cosmos have interacted with other particles in the manner revealed by the Aspect experiments … Also consider … that quantum entanglement grows exponentially with the number of particles involved in the original quantum state and that there is no theoretical limit on the number of these entangled particles. If this is the case, the universe on a very basic level could be a vast web of particles, which remain in contact with one another over any distance in ‘no time’ in the absence of the transfer of energy or information. This suggests, however strange or bizarre it might seem, that all of physical reality is a single quantum system that responds together to further interactions.” Nadeau and Kafatos argue that we live in a non-local universe which is the obvious conclusion from the quantum entanglement experiments. The fact is quanta can exchange information over any distance in the universe instantaneously. These entanglement experiments prove that Einstein was incorrect in stating that nothing travels faster than light (186,000 miles per second). Quantum information “travels” at infinite speed “arriving” at its destination without any time elapsing. Here we see how the Newtonian/Einsteinian language of a local universe fails to describe our actual reality. It’s not that information is “traveling” at infinite “speed” to “arrive” at another location, but rather that the universe with all its so-called parts and particles is actually One non-local quantum system. Information from one particle to another doesn’t need to “travel” there because the space between them is illusory, as is the language of calling them “separate” particles. As we have seen, before observation quanta are not particles with definite attributes and location; they are merely waves in the One universal quantum ocean until our conscious observation individualizes the wave into droplets of experience. “Nonlocality shatters the very foundations of physics. Matter can no longer be considered separate. Actions do not have to have an observable cause over an observable space. Einstein’s most fundamental axiom isn’t correct: at a certain level of matter, things can travel faster than the speed of light. Subatomic particles have no meaning in isolation but can only be understood in their relationships. The world, at its most basic, exists as a complex web of interdependent relationships, forever indivisible.” -Lynne McTaggart, “The Field: The Quest for the Secret Force of the Universe,” (11) “As an aside, it’s interesting to note that Nadeau and Kafatos mention early in their book that readers accidentally encountering their book in the ‘new age’ section of a bookstore would likely be disappointed. That’s because the book is about physics and not new age ideas. But the fact that Nadeau and Kafatos felt it important to mention this at all illustrates the rising tension between the leading edge of interpretations in physics and the tail end of metaphysics. Physicists interested in quantum ontology are painfully aware that some interpretations of quantum reality are uncomfortably close to mystical concepts.
In the eyes of mainstream science, to express sympathy for mysticism destroys one’s credibility as a scientist. Thus the taboo persists.” -Dean Radin, “Entangled Minds” (262)
<urn:uuid:6ae91225-527b-4001-b0b9-d173c36052d5>
CC-MAIN-2022-05
https://illuminatimindcontrol.com/nonlocality-quantum-entanglement/
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320300658.84/warc/CC-MAIN-20220118002226-20220118032226-00041.warc.gz
en
0.936176
1,471
3.9375
4
Alternate format: ITSAP.40.016 Using encryption to keep your sensitive data secure (PDF, 391 KB) Encryption technologies are used to secure many applications and websites that you use daily. For example, online banking or shopping, email applications, and secure instant messaging use encryption. Encryption technologies secure information while it is in transit (e.g. connecting to a website) and while it is at rest (e.g. stored in encrypted databases). Many up-to-date operating systems, mobile devices, and cloud services offer built-in encryption, but what is encryption? How is it used? And what should you and your organization consider when using it? What is encryption? Encryption encodes (or scrambles) information. Encryption protects the confidentiality of information by preventing unauthorized individuals from accessing it. For example, Alice wants to send Bob a message, and she wants to ensure only he can read it. To keep the information confidential and private, she encrypts the message using a secret key. Once encrypted, this message can only be read by someone who has the secret key to decode it. In this case, Bob has the secret key. Eve is intentionally trying to intercept the message and read it. However, the message is encrypted, and even if Eve gets a copy of it, she can’t read it without acquiring the secret key. If an individual accidentally receives a message that includes encrypted information, they will be unable to read the encrypted contents without the key to decrypt the message. How is encryption used? Encryption is an important part of cyber security. It is used in a variety of ways to keep data confidential and private, such as in HTTPS websites, secure messaging applications, email services, and virtual private networks. Encryption is used to protect information while it is actively moving from one location to another (i.e. in transit) from sender to receiver. For example, when you connect to your bank’s website using a laptop or a smartphone, the data that is transmitted between your device and the bank’s website is encrypted. Encryption is also used to protect information while it is at rest. For example, when information is stored in an encrypted database, it is stored in an unreadable format. Even if someone gains access to that database, there’s an additional layer of security for the stored information. Encryption is also used to protect personal information that you share with organizations. For example, when you share your personal information (e.g. birthdate, banking or credit card information) with an online retailer, you should make sure they are protecting your information with encryption by using secure browsing. Many cloud service providers offer encryption to protect your data while you are using cloud based services. These services offer the ability to keep data encrypted when uploading or downloading files, as well as storing the encrypted data to keep it protected while at rest. When properly implemented, encryption is a mechanism that you and your organization can use to keep data private. Encryption is seamlessly integrated into many applications to provide a secure user experience. How can I use encryption? Your organization likely already uses encryption for many applications, such as secure browsing and encrypted messaging applications. If you access a website with padlock icon and HTTPS in front of the web address, the communication (i.e. the data exchanged between your device and the website’s servers) with the website is encrypted. 
To protect your organization’s information and systems, we recommend that you use HTTPS wherever possible. To ensure that users are accessing only HTTPS-supported websites, your organization should implement the web security policy tool HTTP Strict Transport Security (HSTS). HSTS offers additional security by forcing users’ browsers to load HTTPS-supported websites and ignore unsecured websites (e.g. HTTP).

Encrypted messaging applications
Most instant messaging applications offer a level of encryption to protect the confidentiality of your information. In some cases, messages are encrypted between your device and the cloud storage used by the messaging service provider. In other cases, the messages are encrypted from your device to the recipient’s device (i.e. end-to-end encryption). When using end-to-end encryption services, not even the messaging service provider can read your encrypted messages. In deciding which tools to use, you need to consider both the functionality of the service and the security and privacy requirements of your information and activities. For further information, refer to Protect how you connect. Encryption is just one of many security controls necessary to protect the confidentiality of data.

What else should I consider?
Encryption is integrated into many products that are commonly used by individuals and organizations to run daily operations. When choosing a product that uses encryption, we recommend that you choose a product that is certified through the Common Criteria (CC) and the Cryptographic Module Validation Program (CMVP). The CC and the CMVP list cryptographic modules that conform to Federal Information Processing Standards. Although the CC and the CMVP are used to vet products for federal government use, we recommend that everyone use these certified products.

The CCCS recommends
- Evaluate the sensitivity of your information (e.g. personal and proprietary data) to determine where it may be at risk and implement encryption accordingly.
- Choose a vendor that uses standardized encryption algorithms (e.g. CC and CMVP supported modules).
- Review your IT lifecycle management plan and budget to include software and hardware updates for your encryption products.
- Update and patch your systems frequently.
- Prepare and plan for the quantum threat to cyber security. For more information, please see ITSE.00.017 Addressing the Quantum Computing Threat to Cryptography.

Encryption for highly sensitive data
Systems that contain highly sensitive information (e.g. financial, medical, and government institutions) require additional security considerations. Contact us for further guidance on cryptographic solutions for high-sensitivity systems and information: firstname.lastname@example.org.
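To make the Alice-Bob-Eve example above concrete, here is a minimal sketch of symmetric encryption using the Python cryptography package's Fernet recipe. The library choice and the message are illustrative assumptions; this is not a product recommendation or part of the guidance itself:

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Alice and Bob share a secret key ahead of time (distributing that key safely
# is exactly the key-exchange problem discussed elsewhere in this document).
key = Fernet.generate_key()
f = Fernet(key)

token = f.encrypt(b"Meet at noon.")   # what Eve sees on the wire: ciphertext only
print(token)                          # unreadable without the key

print(f.decrypt(token))               # Bob, who holds the key, recovers the plaintext
```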
<urn:uuid:c6adfcad-6c3d-41a7-9ebc-a06858a799ca>
CC-MAIN-2022-05
https://cyber.gc.ca/en/guidance/using-encryption-keep-your-sensitive-data-secure-itsap40016
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320302723.60/warc/CC-MAIN-20220121040956-20220121070956-00042.warc.gz
en
0.910774
1,266
3.5625
4
Nowadays, you might have heard many times that quantum computers are the future of modern computing. But what is a quantum computer? And why don’t we already have them in our homes? In this article, we will explain what these systems consist of, how they work, and when we can expect more widespread implementation. First of all, you should bear in mind that the idea that quantum computers will end up replacing PCs is wrong. Using a ‘normal’ home PC is still the easiest and cheapest solution for most everyday problems and user needs, and will continue to be for a long, long time. However, quantum computers promise to drive technological advancements in many fields — from materials science to pharmaceutical research, which is why many companies are investing in developing this technology. What is a quantum computer, and how does it really work? Quantum computers take advantage of some of the almost ‘mystical’ phenomena of quantum mechanics to offer great advances in processing power — the idea is that even a relatively simple quantum computer could outperform today’s supercomputers on certain problems. The secret of this type of equipment lies in its ability to generate and manipulate quantum bits, known as qubits. What are qubits, and how do they work? Today’s computers work with bits, which are nothing more than a stream of electrical (or optical) pulses that represent ones and zeros in the binary system. Everything from the emails you use to YouTube videos to this very article you’re reading is essentially long strings of binary digits. Quantum computers, in contrast, use qubits, which are encoded in quantum objects such as electrons or photons. Generating and managing qubits represents quite an engineering challenge: companies like IBM or Google use superconducting circuits cooled to almost absolute zero for this, while other companies like IonQ manage them by trapping individual atoms in electromagnetic fields using silicon chips in vacuum chambers. In both cases, the main goal is to isolate the qubits and keep them in a controlled quantum state. The most curious thing about these qubits is that they can be in both computational states at the same time, which makes individual measurement outcomes impossible to predict exactly; everything is expressed as probabilities of finding one state or the other. Qubits have some peculiar quantum properties, and the one that interests us the most is that when they form groups, they provide exponentially greater processing power than bits used in binary systems. These properties are called superposition and entanglement. The most remarkable peculiarity of qubits is that, unlike bits that can only be ones and zeroes, they can hold not only the states one and zero but also one and zero simultaneously. This ability to represent several states at the same time is what is called superposition. For the qubits to reach this state, it is necessary to manipulate them with precisely controlled lasers or microwave pulses. It seems impossible, but that’s how quantum mechanics works: thanks to this phenomenon, a quantum computer with several qubits in superposition can process a huge number of calculation results simultaneously. The final result of a calculation is generated only after the qubits are measured, which immediately causes their state to ‘collapse’ to a one or a zero. Engineers can also generate pairs of qubits that are ‘entangled’ with each other, meaning that both pair members exist in a single quantum state.
Changing the state of one of these qubits will immediately change the state of the other. And this will happen even if they are separated by long distances. No one knows exactly why this entanglement behaves the way it does, and even the famous Einstein described it as “spooky action at a distance“, but the fact is that it is key to the computing power of quantum computers. In a conventional computer, doubling the number of bits doubles its processing power, while in a quantum machine, adding qubits increases its capacity exponentially. Thus, quantum computers take advantage of these qubits entangled in a kind of chain to work their magic. The ability of these machines to speed up calculations using specially designed quantum algorithms is the reason why there is so much expectation about their potential. That is the good news. The bad news is that quantum computers are far more prone to miscalculation than normal computers due to another phenomenon — decoherence. The interaction of qubits with their environment sometimes causes their quantum behavior to decay and eventually disappear, a process called quantum decoherence. Their quantum state is extremely fragile, and the slightest vibration or temperature change — known by the term ‘noise’ — can cause qubits to ‘fall’ out of their superposition state before they have finished performing their job. For this reason, it is imperative that a quantum computer is totally isolated from the environment — humidity, temperature changes, vibrations, etc. — and therefore it is necessary to put them in large refrigerators and vacuum chambers. However, these chambers and coolers are not perfect, and in the end, the noise causes errors in the calculations. Smart quantum algorithms compensate for some of these errors, and adding extra qubits to each calculation also helps. Still, by current estimates, it takes thousands of physical qubits to create a single fully reliable qubit, known as a ‘logical qubit.’ This, on the other hand, would greatly reduce the total computing power. And there lies the problem — so far, researchers have not been able to control systems of more than roughly 128 physical qubits, so it has been impossible to build a single logical qubit. By those estimates, we are decades away from being able to achieve it. What is the use of a quantum computer? One of the most promising applications of these systems is to simulate the behavior of matter at the molecular level. Automakers such as Volkswagen or Daimler (Mercedes-Benz) already use quantum computers to simulate the chemical composition of electric car batteries to find ways to improve their performance. Pharmaceutical companies use them to analyze and compare compounds that could lead to the creation of new medications. The machines are also excellent for solving optimization problems since, with their computing power, they can analyze a large number of possible solutions to a problem. For example, the Airbus company uses them to calculate more efficient ascent and descent routes for its planes. Volkswagen has already introduced a service that calculates optimal routes for buses and taxis in cities to avoid traffic jams. In any case, there are still many years — surely decades — until quantum computers can be fully viable, and indeed even longer until their use is standardized. There are also everyday tasks that a quantum computer would be poorly suited for, even though a common PC solves them without a problem.
Another issue is software, since a totally different programming model is required and the entire sector would have to migrate, something that cannot be done in just a few years. Outside certain specialized environments, the reality is that the quantum computer is still far from reaching us, the ordinary users.
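That "totally different programming model" can be glimpsed even in a toy example. The sketch below is an illustrative NumPy state-vector simulation (my own, not code for any real quantum machine): it prepares an entangled pair with a Hadamard and a CNOT gate and samples measurements, and only the correlated outcomes 00 and 11 ever appear.

```python
import numpy as np

# Toy state-vector simulation (illustrative only): two qubits start in |00>,
# a Hadamard puts the first into superposition, a CNOT entangles the pair.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                    # control = first qubit

state = np.zeros(4); state[0] = 1.0                # |00>
state = CNOT @ (np.kron(H, I) @ state)             # (|00> + |11>) / sqrt(2)

probs = np.abs(state) ** 2
rng = np.random.default_rng(0)
samples = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print("Measurement samples:", samples.tolist())    # only '00' and '11' appear
```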
<urn:uuid:e2d78930-8637-42bc-8baa-1956bd5eec84>
CC-MAIN-2022-05
https://codeandhack.com/what-is-a-quantum-computer/
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320306181.43/warc/CC-MAIN-20220129122405-20220129152405-00283.warc.gz
en
0.954666
1,491
3.671875
4
The Classification of Matter (COM) programme is one of the main avenues of research into the physical world. It aims to identify the various properties of matter, from atomic particles to atoms, and then to infer their properties using experiments. This is known as general relativity, which was developed in the 1950s by the American physicist Richard Feynman. The main aim of the programme is to develop a general theory of gravity. The aim is to explain the properties of the world around us using a simple physical theory. It was originally developed to explain how matter and antimatter behave. In general relativity theory, matter and antiparticles interact and interact with each other, creating gravitational waves, and this interaction creates the effects we see in the cosmos. In some cases, the interactions are so powerful that gravity is observed. There are several ways of looking at this. One way is to say that matter and anti-matter interact in a way that is fundamentally different from the way we normally experience the world. The second way is that these interactions are completely different from what we experience, but that they nevertheless give rise to a property called the special theory of relativity (STT), which gives us the properties we observe in the universe. But there are also a few other ways to look at it, such as the classical special theory, which describes the properties and interactions of matter and space-time, and the quantum special theory (QFT), which describes quantum interactions between particles. The three are called the Classical, Quantum and Special. In terms of the Standard Model of particle physics, the Standard model is a description of the fundamental physics of the universe, which is the universe that we see. There is one difference between the Standard and the Standard models. The Standard Model assumes that all matter and energy in the observable universe exist in a single state. The Quantum Model assumes there are different states of matter or energy. The QFT assumes there is only one possible state of matter at any time, and it is this state that we observe. The classical models assume a universe where matter and matter’s interaction with each another is a constant, but there is no fixed state of mass or energy, and therefore there are a variety of possible states of mass and energy. If we are to understand the physical laws of the Universe, we must consider all possible states, but this requires us to look in all possible universes. We can only look at the Standard Models in the Standard Universe because the Standard Standard is so stable. But it is also possible to think of the Quantum Model as being more stable. Quantum mechanics is the study of the nature of particles. It describes the behavior of a particle as it interacts with a field, such that the particle is always moving in a direction which is different from that of the field. For example, a particle in the quantum world is always changing direction, and if the particle’s position is changed, the direction the particle will change is also changed. The two are the same. In the quantum theory, the two are not necessarily the same thing, but the particles behave in the same way as if they were. If you have a particle that is moving in the direction of a magnetic field, it is in the Quantum world, but if the magnetic field changes direction, the particle goes into the Standard world. 
There might be other possible states that we cannot account for, but we cannot see the particles as particles, because they are moving in opposite directions. The quantum world cannot be considered the Standard World because the two different states are not the same, and they do not interact with one another. The following table gives some examples of what is possible in the standard universe: We are looking for a point P in space P and a point Q in time Q, where P is a particle, and Q is a point in time. Let us suppose we are in the position P at time t and let’s call the point P a particle. We would expect the particle to be in the field P if and only if it moves in a straight line. If it is a wave, then we would expect it to be moving in an opposite direction from the direction P. The question is, how does the particle interact with the field? How do the particles interact? The particle has a state that is known to be called a quantum field. A quantum field is one in which a particle is neither moving nor changing, but is simply a part of a system of quantum bits. The way that the particles in a quantum world interact is to be found in a special theory called quantum entanglement. The particles interact with their fields in a certain way, by changing the state of the particles. In a quantum entangled system, this is a property that is not always obvious. For instance, if the particles have the same quantum field, but in a different state, the quantum particles might be able to get along with each others states. But if the quantum particle’s state is
<urn:uuid:a9ab6209-c318-47a3-b4a1-6b905ea4f0fd>
CC-MAIN-2022-05
https://5raovat.com/2021/08/18/which-classification-of-matter-is-the-best/
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320303868.98/warc/CC-MAIN-20220122164421-20220122194421-00404.warc.gz
en
0.94064
1,044
4.21875
4
After writing my post on basic electrical components I realized that batteries and transistors were going to require a good deal more research to understand adequately. Having completed my post on the former, the time has finally come to elucidate the foundation of modern electronics and computing: the humble transistor. The development of the transistor began out of a need to find a superior means of amplifying telephone signals sent through long-distance wires. Around the turn of the twentieth century, American Telephone and Telegraph (AT&T) had begun offering transcontinental telephone service as a way of staying competitive. The signal boost required to allow people to talk to each other over thousands of miles was achieved with triode vacuum tubes based on the design of Lee De Forest, an American inventor. But these vacuum tubes consumed a lot of power, produced a lot of heat, and were unreliable to boot. Mervin Kelly of Bell Labs recognized the need for an alternative and, after WWII, began assembling the team that would eventually succeed. Credit for pioneering the transistor is typically given to William Shockley, John Bardeen, and Walter Brattain, also of Bell Labs, but they were not the first people to file patents for the basic transistor principle: Julius Lilienfeld filed one for the field-effect transistor in 1925 and Oskar Heil filed one in 1934. Neither man made much of an impact in the growing fields of electronics theory or electronics manufacturing, but there is evidence that William Shockley and Gerald Pearson, a co-worker at Bell Labs, did build a functioning transistor prototype from Lilienfeld's patents. Shockley, Brattain, and Bardeen understood that if they could solve certain basic problems they could build a device that would act like a signal amplifier in electronic circuits by exploiting the properties of semiconductors to influence electron flow. Actually accomplishing this, of course, proved fairly challenging. After many failed attempts and much cataloging of anomalous behavior, a practical breakthrough was achieved. A strip of gold, an excellent conductor, was attached to a plastic wedge and then sliced with a razor, producing two gold foil leads separated by an extremely small space. This apparatus was then placed in contact with a germanium crystal which had an additional lead attached at its base. The space separating the two pieces of gold foil was just large enough to prevent electron flow. Unless, that is, current were applied to one of the gold-tipped leads, which caused 'holes' — i.e. spaces without electrons — to gather on the surface of the crystal. This allowed electron flow to begin between the base lead and the other gold-tipped lead. This device became known as the point-contact transistor, and it gained the trio a Nobel Prize. Though the point-contact transistor showed promise and was integrated into a number of electrical devices, it was still fragile and impractical at a larger scale. This began to change when William Shockley, outraged at not receiving the credit he felt he deserved for the invention of this astonishing new device, developed an entirely new kind of transistor based on a 'sandwich' design. The result was essentially a precursor to the bipolar junction transistor, which is what almost everyone in the modern era means by the term 'transistor'.

Under the Hood

In the simplest possible terms, a transistor is essentially a valve for controlling the flow of electrons.
Valves can be thought of as amplifiers: when you turn a faucet handle, the force produced by your hand is amplified to control the flow of thousands of gallons of water, and when you press down on the accelerator in your car, the pressure of your foot is amplified to control the motion of thousands of pounds of fire and steel. Valves, in other words, allow small forces to control much bigger forces. Transistors work in a similar way. One common type of modern transistor is the bipolar junction NPN transistor, a cladistic descendant of Shockley's original design. It is constructed from alternating layers of silicon which are doped with impurities to give them useful characteristics. In its pure form silicon is a textbook semiconductor. It contains four electrons in its valence shell, which causes it to form very tight crystal lattices that typically don't facilitate the flow of electrons. The N layer is formed by injecting trace amounts of phosphorus, which contains five valence electrons, into this lattice. It requires much less energy to knock this fifth electron loose than it would to knock loose one of the four valence electrons in the silicon crystal, making the N layer semiconductive. Similarly, the P layer is formed by adding boron which, because of the three electrons in its valence shell, leaves holes throughout the silicon into which electrons can flow. It's important to bear in mind that neither the P nor the N layer is electrically charged. Both are neutral and both permit greater flow of electrons than pure silicon would. The interface between the N and P layers quickly becomes saturated as electrons from the phosphorus move into the holes in the valence shell of the boron. As this happens it becomes increasingly difficult for electrons to flow between the N and P layers, and eventually a boundary is formed. This is called the 'depletion layer'. Now, imagine that there is a 'collector' lead attached to the first N layer and an 'emitter' lead attached to the other N layer. Current cannot flow between these two leads because the depletion layer at the P-N junction won't permit it. Between these two layers, however, there is a third lead, called a 'base', placed very near the P layer. By making the base positively charged, electrons can overcome the P-N junction and begin flowing from the emitter to the collector. The key here is to realize that the amount of charge applied to the base to get current moving is much smaller than the current flowing to the collector, and that the current flow can be increased or decreased by a corresponding change in the current to the base. This is what gives the transistor its amplifier properties.

Transistors and Moore's Law

Even more useful than this, however, is the ability of a transistor to act as a switch. Nothing about the underlying physics changes here. If current is not flowing in the transistor it is said to be in cutoff, and if current is flowing in the transistor it is said to be in saturation. This binary property of transistors makes them ideally suited for the construction of logic gates, which are the basic components of every computer ever made. A full discussion of logic gate construction would be well outside the purview of this essay, but it is worth briefly discussing one popular concept which requires a knowledge of transistors in order to be understood. Named after Intel co-founder Gordon Moore, Moore's Law is sometimes stated as the rule that computing power will double roughly every two years.
The more accurate version is that the number of transistors which can fit in a given unit area will double every two years. These two definitions are fairly similar, but keeping the latter in mind will allow you to better understand the underlying technology and where it might head in the future. Moore's law has held for as long as it has because manufacturers have been able to make transistors smaller and smaller. Obviously this can't continue forever, both because at a certain transistor density power consumption and heat dissipation become serious problems, and because at a certain size effects like quantum tunneling prevent the sequestering of electrons. A number of alternatives to silicon-based chips are being seriously considered as a way of extending Moore's Law. Because of how extremely thin it can be made, graphene is one such contender. The problem, however, is that the electrophysical properties of graphene are such that building a graphene transistor that can switch on and off is not straightforward. A graphene-based computer, therefore, might well have to develop an entirely different logical architecture to perform the same tasks as modern computers. Other potentially fruitful avenues are quantum computing, optical computing, and DNA computing, all of which rely on very different architectures than conventional von Neumann computers. As I'm nearing the 1,500-word mark I think I'll end this essay here, but I do hope to return to these advanced computing topics at some point in the future 🙂
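To tie the switch behaviour described above back to logic gates, here is a deliberately idealized sketch (purely digital, ignoring all the analog detail of a real NPN device; the function names are made up for illustration): treat each transistor as a switch that conducts only when its base is driven, and combine two of them to get NAND and NOR, from which any other logic circuit can be built.

```python
def npn_switch(base_high: bool) -> bool:
    """Idealized transistor-as-switch: conducts (saturation) only when the
    base is driven; otherwise it sits in cutoff and blocks current."""
    return base_high

def nand(a: bool, b: bool) -> bool:
    # Two idealized switches in series pull the output low only when both
    # conduct; otherwise the output stays high (pulled up to the supply).
    pulled_low = npn_switch(a) and npn_switch(b)
    return not pulled_low

def nor(a: bool, b: bool) -> bool:
    # Two switches in parallel pull the output low if either one conducts.
    return not (npn_switch(a) or npn_switch(b))

for a in (False, True):
    for b in (False, True):
        print(f"a={a!s:5} b={b!s:5}  NAND={nand(a, b)!s:5}  NOR={nor(a, b)}")
```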
<urn:uuid:77d26bc9-d112-4507-9d93-2abfc539adc8>
CC-MAIN-2022-05
https://rulerstothesky.com/2016/07/23/the-stempunk-project-transistors/
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320302622.39/warc/CC-MAIN-20220120190514-20220120220514-00205.warc.gz
en
0.961446
1,730
3.640625
4
Light-storing chip charted
Technology Research News

For years, researchers have been striving to make high-speed, low-power chips that channel light rather than electricity, but finding ways to briefly store light pulses has proved extremely challenging. Recently, researchers have stored light pulses for fractions of a second in hot gases, extremely cold gases or crystals doped with special metals. But these techniques are challenging to carry out, and would be difficult or impossible to configure into more practical chip form. Researchers at Stanford University have come up with a scheme to store light pulses under ordinary conditions using photonic crystal -- semiconductor chips that contain regularly spaced holes or rods of a different material. "Our discovery enables quantum coherent storage of light pulses on a microchip about the size of a grain of salt," said Mehmet Fatih Yanik, a research assistant at Stanford University. The scheme could lead to inexpensive chips that power all-optical communications switches, quantum computers and quantum communications devices. "Operating wavelengths [and] bandwidths... can simply be designed by standard lithographic techniques used in conventional microchip technologies." The method would allow light pulses to be stored in microchips at room temperature without requiring any special light-matter interactions. The researchers' findings run counter to the conventional wisdom that devices using optical resonators -- tiny structures that vibrate at light frequencies -- can do no more than slow light by a limited amount. In one type of device, for example, light pulses at the telecommunications wavelength of 1.55 microns and a rate of 10 gigabits per second can be slowed to no less than one hundredth the speed of light in a vacuum. The key to the researchers' method is a technique that allows them to change -- on the fly -- the way portions of the photonic crystal respond to light. "We discovered a practical way to compress light's bandwidth by an unlimited amount... using conventional optoelectronics technologies at speeds sufficient to prevent light pulses [from] passing through our system," said Yanik. The researchers' simulation shows that light pulses can be slowed to less than 10 centimeters per second, slow enough that the pulses would be held essentially in place for tiny fractions of a second, according to Yanik. This is long enough to make pulses interact to switch light signals for high-speed communications or link photons for quantum computing. The researchers' light-controlling chip design calls for photonic crystal that contains a series of optical resonators, or cavities. Photonic crystal refracts, or bends, light -- the same effect that produces the familiar bent-drinking-straw illusion. The boundaries made by the photonic crystal's holes or rods refract light, and the spacing of these gaps determines the degree to which a given wavelength of light is bent. Photonic crystal can be designed to block or channel specific wavelengths. In the researchers' design, one series of cavities forms a straight waveguide that allows light pulses to pass through the device. Each cavity in the waveguide is attached to a side cavity that connects to a second series of cavities. The chip would briefly trap a pulse by changing the microcavities' resonant frequencies. Tuning the waveguide to resonate at the same frequency as the light pulse while keeping the side cavities out of tune would allow the pulse to enter the device.
Once the pulse is inside the device, the waveguide would be gradually -- though at very high speed -- detuned while the side cavities were tuned to the pulse frequency. This would shunt the pulse into the side cavities. Reversing the tuning-detuning process would release the pulse into the waveguide, allowing it to continue on its way through the device. Key to the method is a way to tune the refractive index of the photonic crystal that preserves the shape of the pulse. Light pulses contain multiple wavelengths, and the wavelengths bend to different degrees as pulses travel through matter. This disperses the wavelengths, causing light pulses to spread out, which limits the distance they can travel through a material. Wavelength dispersion also limits the amount by which light pulses can be slowed, because they can spread only so much. The researchers' technique tunes a device's refractive index in a way that lowers the frequency of all of the pulse's wavelengths consistently, preserving the pulse. A set of 120 microcavities whose tunings change at a maximum rate of one gigahertz is sufficient to store and release a light pulse, according to Yanik. Multiple light pulses could be stored simultaneously in the device, and specific pulses could be released on demand, he said. The researchers' scheme could also be applied to other systems that involve resonance, said Yanik. It could be used to slow and store microwave signals and ultrasound waves, and possibly to detect gravitational waves. The technique is an advance over previous work on stopped light because it uses microscopic optical cavities rather than atoms, said Raymond Chiao, a professor of physics at the University of California at Berkeley. "This allows much larger bandwidths of light to be stopped." The work would have been more impressive had the authors demonstrated the stopping of light experimentally, he added. The researchers are aiming to demonstrate their technique by trapping microwave signals. A demonstration should take place within a year, and a practical prototype that works at optical frequencies could be made in two to five years, said Yanik. Yanik's research colleague was Shanhui Fan. The work is slated for publication in Physical Review Letters. The research was funded by the National Science Foundation (NSF) and Stanford University.
Timeline: 2-5 years
Funding: Government, University
TRN Categories: Optical Computing, Optoelectronics and Photonics
Story Type: News
Related Elements: Technical paper, "Stopping Light All-Optically," posted at the arXiv physics archive at arxiv.org/abs/quant-ph/0312027
February 11/18, 2004
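As a quick sanity check on the storage figures quoted in the story, here is a back-of-the-envelope calculation; the 10 cm/s speed comes from the researchers' simulation, while the millimetre-scale chip length is an assumption of mine purely for illustration.

```python
# Back-of-the-envelope: how long a pulse stays "in place" on the chip.
# The ~10 cm/s figure is from the researchers' simulation; the 1 mm chip
# length is an assumed, grain-of-salt-sized scale used only for illustration.
SPEED_OF_LIGHT = 3.0e8       # m/s, in vacuum
slowed_speed = 0.10          # m/s (10 centimeters per second)
chip_length = 1.0e-3         # m (assumed ~1 mm device)

print(f"Slow-down factor: {SPEED_OF_LIGHT / slowed_speed:.0e}")
print(f"Time to cross the chip at 10 cm/s: {chip_length / slowed_speed * 1e3:.0f} ms")
print(f"Time to cross the chip at c:       {chip_length / SPEED_OF_LIGHT * 1e12:.1f} ps")
```

A traversal time of roughly ten milliseconds, versus a few picoseconds at full speed, is consistent with the article's "held essentially in place for tiny fractions of a second".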
<urn:uuid:ea10d545-383b-42b8-85af-a5a74250dd9f>
CC-MAIN-2022-05
http://trnmag.com/Stories/2004/021104/Light-storing_chip_charted_021104.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320305242.48/warc/CC-MAIN-20220127072916-20220127102916-00486.warc.gz
en
0.891677
1,523
3.8125
4
Building a large-scale physical quantum computer is still challenging. When scaling up qubits, wiring diagrams get increasingly complicated. Bogdan Govoreanu, quantum computing program manager at imec, presents a smart way of interconnecting neighboring silicon qubits in a 2D bilinear array. This architecture tackles the qubit connectivity problem and is a potential pathway for realizing a quantum computer. The array design, together with an analysis of the device geometrical requirements based on advanced multiscale modeling, is presented in a paper at IEDM 2021.

How to build a large quantum computer

Quantum computers leverage the properties of quantum physics to process larger amounts of data significantly faster than classical computers. The basic units, quantum bits or qubits, can exist in two states simultaneously, making it possible to sift through a vast number of potential outcomes at once. Silicon-based qubits are very attractive for potential use in quantum computers because they are compatible with the well-established processes of high-volume manufacturing in the semiconductor industry. Nevertheless, scaling up the number of qubits remains a roadblock for building large-scale quantum computers. While small arrays have been demonstrated, a practical design that scales to the requirements where it outperforms classical computers is still lacking. One bottleneck to developing larger quantum computers is the problem of how to arrange qubits. Efficient quantum algorithms require 2D arrays where qubits can interact with their neighbors and be accessed by external circuits and devices. Each qubit needs dedicated lines for control and readout, and the pitch between two qubits is typically only tens of nanometers. Increasing the number of qubits therefore makes it difficult to access the qubits at the center of the array. "We propose an elegant solution to this challenge: a bilinear 2D design for silicon qubits where each qubit connects to four other qubits," says Bogdan Govoreanu. "This architecture yields a compact array where different qubit coupling mechanisms are combined to achieve an overall connectivity of four for each qubit in the bilinear array."

Solving the qubit connectivity problem

"Our design is based on topologically mapping a 2D square lattice to form a so-called bilinear design, where alternating rows of the lattice are shifted into two rows or 1D arrays (see Figure 1). By arranging the qubits in two rows they always remain addressable while maintaining the target connectivity of four in the equivalent 2D square lattice array. These arrays are also easily scalable as we only need to grow them in one dimension, along the rows," explains Bogdan Govoreanu. "The connections between the two 1D arrays do not intersect because they are wired on two different planes, separated by a ground plane to isolate them from each other (Figure 2)." In this architecture, each qubit corresponds to the spin orientation of an electron confined in a potential well, called a quantum dot. Coupling these qubits is necessary for 'quantum entanglement', a property that underlies the exponential computing power of quantum computers. Entangled qubits store all the possible combinations of the quantum states of each qubit (e.g. for two qubits, this results in four values). The quantum dots within a 1D array are coupled through the spin interaction between electrons in nearby quantum dots, where nearby electron spins naturally interact through a quantum mechanical process called exchange coupling.
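To make the row-folding idea concrete, here is a small illustrative sketch; the function names and indexing convention are my own assumptions for illustration, not imec's published layout. Even-numbered lattice rows are sent to one 1D array and odd-numbered rows to the other, so every qubit keeps exactly the four neighbours it had in the square lattice, served by the two coupling mechanisms mentioned above (one within each row, one between the rows).

```python
# Illustrative only: fold a square lattice into a "bilinear" layout of two rows.
# Even lattice rows go to array A, odd rows to array B; within each 1D array
# the qubits keep their lattice order, so the neighbour relations of the
# original 2D lattice (connectivity of four) are preserved.
ROWS, COLS = 4, 4

def bilinear_position(r, c):
    """Map lattice site (r, c) to (array 'A' or 'B', index along that array)."""
    array_id = "A" if r % 2 == 0 else "B"
    return array_id, (r // 2) * COLS + c

def lattice_neighbours(r, c):
    steps = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    return [(r + dr, c + dc) for dr, dc in steps
            if 0 <= r + dr < ROWS and 0 <= c + dc < COLS]

site = (1, 2)  # an interior qubit
print("site", site, "->", bilinear_position(*site))
for nb in lattice_neighbours(*site):
    # In this toy mapping, horizontal neighbours land next to each other in the
    # same array (short-range coupling); vertical neighbours land in the other
    # array (the long-range links between the two rows).
    print("  neighbour", nb, "->", bilinear_position(*nb))
```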
The quantum dots between the 1D arrays are coupled over a long distance (~mm) via a microwave resonator, fabricated using superconducting materials. Such a long range is possible since the qubit state can be coupled to the photonic mode of the resonator when the qubit electron is delocalized between two quantum dots. The quantum states are very fragile and prone to error. That is why building a large quantum computer is not just about scaling up the number of qubits; it is also about how resistant they are to errors. Since quantum computers cannot use the same error-correcting algorithms as classical computers, they fall back on quantum error correction techniques with 'logical qubits' - a complex arrangement of thousands of physical qubits that is used to encode a single qubit. "Our design is compatible with the widely accepted quantum error correction scheme, the surface code, which can run algorithms while tolerating up to a certain qubit error rate," explains Bogdan Govoreanu. "The typical number of physical qubits needed to implement a logical qubit is believed to be somewhere between 10^3 and 10^4, depending on the quality of the physical qubits. Hundreds to thousands of logical qubits are necessary for running practical large-scale algorithms, which implies that overall physical qubit numbers can exceed a million. In this paper, we characterized the relevant quantum resources needed for viable quantum error correction, along with providing a detailed analysis of the required device dimensions, tolerable noise specifications and quantum gate operation times in the structure (Figure 3). The bilinear architecture needs an extremely compact quantum logic area of around 36 mm² even for a system with a million qubits. Moreover, the resonators and the electrostatic gates defining the quantum dots are easily accessible from both sides in the bilinear array, thereby considerably reducing the wiring fanout complexity." "This design is compatible with current CMOS fabrication technologies and can thus open the path for a future demonstration of large-scale silicon quantum computers," concludes Bogdan Govoreanu.

Want to know more?
- Follow imec's presence at the 2021 IEEE International Electron Devices Meeting
- Read all details of the novel device architecture in the paper entitled "Large-Scale 2D Spin-Based Quantum Processor with a Bi-Linear Architecture" by F.A. Mohiyaddin, R. Li, S. Brebels, G. Simion, N. I. Dumoulin Stuyck, C. Godfrin, M. Shehata, A. Elsayed, B. Gys, S. Kubicek, J. Jussot, Y. Canvel, S. Massar, P. Weckx, P. Matagne, M. Mongillo, B. Govoreanu and I. P. Radu, presented at IEDM 2021, which can be requested here.

Bogdan Govoreanu is Quantum Computing Program Manager at imec, where he coordinates the technical research and development program activities. Prior to this, his research work included various topics in Memory Technology and Emerging Devices, with a focus on Flash and resistive switching memory technologies, selectors and neuromorphic computing. He developed strong interests in novel concepts for computing and storage beyond current mainstream technologies. As of 2017, his research focuses on Quantum Computing Technologies and Systems. Bogdan holds a Ph.D. degree in Applied Sciences from KU Leuven for his research performed at imec on novel tunnel barrier concepts.

11 December 2021
<urn:uuid:71d26dac-6dfd-4ee5-9f51-1aec9e84404b>
CC-MAIN-2022-05
https://www.imec-int.com/en/articles/connecting-quantum-dots-bilinear-2d-device-architecture-large-scale-silicon-based-quantum
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320302740.94/warc/CC-MAIN-20220121071203-20220121101203-00328.warc.gz
en
0.899722
1,453
3.515625
4
Quantum Inspire, hosted by QuTech, a collaboration of Delft University of Technology and TNO, the Netherlands Organization for Applied Research, consists of two independent quantum processors, Spin-2 and Starmon-5, and a quantum simulator. Anyone can create an account, use the Web interface to write a quantum algorithm, and have it executed by one of the processors in milliseconds (if there is no queue), with the result returned within a minute. The process is fully automated. Seen from the outside, Spin-2 and Starmon-5 are two large, cylindrical cryostats hanging from the ceiling in a university building. One floor up, a man-size stack of electronics for each takes care of the cooling, feeding the quantum processor input from users and reading out the results. Usually, there is no one in these rooms. The facility officially went online on April 20, and over 1,000 accounts have been created since then. Though many curious visitors never returned, active users now upload about 6,000 jobs a month to be executed. A quantum computer uses qubits for its computations. A qubit can be a single electron, or an electronic circuit, that has two quantum energy states, which correspond to the 0 and 1 of a classical bit of information. However, the magic of quantum physics enables a qubit to be in a superposition of 0 and 1 at the same time, and N qubits that are prepared in a superposition can represent all 2^N combinations of these 0s and 1s simultaneously. In a sense, its capacity doubles with every qubit added, so a quantum computer with just 50 error-free qubits could achieve an enormous speed-up compared to a standard computer. With only two qubits, Spin-2 is mainly a proof of principle, for it is the first time that qubits based on electron spins are accessible online. These consist of a single electron each, trapped in a 'quantum dot', a nanoscale structure on a silicon chip cooled to 0.02 kelvin (0.02 degrees above absolute zero). An electron can be in a superposition of two spin states, 'up' and 'down'. Calibration of the relatively unstable quantum dots still needs manual tuning: every four hours, the system has 20 minutes of downtime. According to Richard Versluis, principal systems engineer at QuTech and Spin-2 lead, operating spin qubits is currently more difficult than other types of qubits, but they promise advantages in the long run: "They can be built with standard chip technology, and they are so tiny that millions would fit on one chip. The promise is very big, the challenge is also very big." Starmon-5 has five transmon qubits. A transmon is a superconducting electronic circuit that can be switched between two tunable energy states. With five stable, entangled qubits, Starmon-5 can execute short quantum algorithms composed from a universal set of quantum gates, equivalent to the operations of classical computer logic, like AND and NOT. Quantum Inspire does not yet offer sufficient qubits to achieve 'quantum supremacy', which means quickly performing a calculation that would take a classical computer thousands of years. Google claimed to have accomplished this last October with its 53-qubit Sycamore machine, although IBM disputed that claim. IBM itself offers Quantum Experience, an online quantum computer with 16 transmon qubits accessible for free. Said Leonardo di Carlo, QuTech co-founder and Starmon-5 lead, "For us, the educational aspect is important. We plan to include this in our teaching to our students in the fall."
Starmon-5 is a useful testbed for quantum computations, for the correct execution of quantum software is less straightforward than running a few lines of code on a regular computer. Said Di Carlo, "I am an avid user of Quantum Inspire myself. I'm especially interested in learning how our quantum computers can improve from the end-user perspective." Even a 5-qubit quantum computer that performs reliably is a huge step forward, says Di Carlo. "For example, Starmon-5 executes Grover's search algorithm with 88% success probability. The first time someone did this, in 2012, the success rate was 59%, and that was considered a breakthrough." Grover's algorithm, which does searches similar to looking up a name in a phone directory, is one of the few quantum algorithms that are proven to outperform all classical search algorithms, so it is often used as a test. Also, Starmon-5 was designed with modularity in mind: a chip with many more transmon qubits would fit into the same cryostat. Di Carlo is cautious about making predictions about when such a larger system will become available, but increasing the number of qubits is a key priority. While QuTech is developing quantum computing hardware, another Dutch collaboration, QuSoft, works on the quantum software that will run on these computers. Koen Groenland, until recently doing a postdoc on 'Few qubit applications' at QuSoft, says Quantum Inspire is an interesting proof of concept, but still too small for serious research applications. According to Groenland, the minimum size for applications will be somewhere between 50 and 200 qubits. Nevertheless, even with five qubits, valuable experience can be gained. Unlike classical bits, which can only be flipped from 0 to 1 or vice versa, operations on qubits resemble continuous rotations. Said Groenland, "Lots of variations are possible even with only a few qubits, and this allows you to fine-tune your quantum algorithm till it does what you want." In quantum computing, size gets most of the attention, but quality matters as well, Groenland said. "On the one hand, you want more qubits, but you also want qubits that stay in superposition long enough to execute an algorithm with many consecutive operations."

Arnout Jaspers is a freelance science writer based in Leiden, the Netherlands.
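As a concrete illustration of the Grover search discussed above, here is a small NumPy sketch (my own, not QuTech's software or anything you would upload to Quantum Inspire): with two qubits there are four 'directory entries', and on an ideal, noise-free simulator a single Grover iteration finds the marked entry with certainty. Hardware figures such as Starmon-5's 88% reflect the accumulated noise of real devices.

```python
import numpy as np

# Ideal 2-qubit Grover search (illustrative): four entries, one marked item.
marked = 2                                   # index of the item we search for
n_states = 4

state = np.full(n_states, 0.5)               # uniform superposition over 4 states

oracle = np.eye(n_states)
oracle[marked, marked] = -1                  # flip the sign of the marked item

uniform = np.full(n_states, 0.5)
diffusion = 2 * np.outer(uniform, uniform) - np.eye(n_states)  # inversion about the mean

state = diffusion @ (oracle @ state)         # one Grover iteration
print("Probability of each outcome:", np.round(np.abs(state) ** 2, 3))
# -> the marked index carries probability 1.0 on this noise-free simulation
```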
<urn:uuid:56decdff-b00e-44df-9f99-85ca9e3c3dde>
CC-MAIN-2022-05
https://cacm.acm.org/news/248166-first-european-quantum-computing-facility-goes-online/fulltext
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320305242.48/warc/CC-MAIN-20220127072916-20220127102916-00489.warc.gz
en
0.943383
1,252
3.59375
4
Nearly a century after dark matter was first proposed to explain the motion of galaxy clusters, physicists still have no idea what it's made of. Researchers around the world have built dozens of detectors in hopes of discovering dark matter. As a graduate student, I helped design and operate one of these detectors, aptly named HAYSTAC. But despite decades of experimental effort, scientists have yet to identify the dark matter particle. Now, the search for dark matter has received an unlikely assist from technology used in quantum computing research. In a new paper published in the journal Nature, my colleagues on the HAYSTAC team and I describe how we used a bit of quantum trickery to double the rate at which our detector can search for dark matter. Our result adds a much-needed speed boost to the hunt for this mysterious particle.

Scanning for a dark matter signal

There is compelling evidence from astrophysics and cosmology that an unknown substance called dark matter constitutes more than 80% of the matter in the universe. Theoretical physicists have proposed dozens of new fundamental particles that could explain dark matter. But to determine which – if any – of these theories is correct, researchers need to build different detectors to test each one. One prominent theory proposes that dark matter is made of as-yet hypothetical particles called axions that collectively behave like an invisible wave oscillating at a very specific frequency through the cosmos. Axion detectors – including HAYSTAC – work something like radio receivers, but instead of converting radio waves to sound waves, they aim to convert axion waves into electromagnetic waves. Specifically, axion detectors measure two quantities called electromagnetic field quadratures. These quadratures are two distinct kinds of oscillation in the electromagnetic wave that would be produced if axions exist. The main challenge in the search for axions is that nobody knows the frequency of the hypothetical axion wave. Imagine you're in an unfamiliar city searching for a particular radio station by working your way through the FM band one frequency at a time. Axion hunters do much the same thing: They tune their detectors over a wide range of frequencies in discrete steps. Each step can cover only a very small range of possible axion frequencies. This small range is the bandwidth of the detector. Tuning a radio typically involves pausing for a few seconds at each step to see if you've found the station you're looking for. That's harder if the signal is weak and there's a lot of static. An axion signal – in even the most sensitive detectors – would be extraordinarily faint compared with static from random electromagnetic fluctuations, which physicists call noise. The more noise there is, the longer the detector must sit at each tuning step to listen for an axion signal. Unfortunately, researchers can't count on picking up the axion broadcast after a few dozen turns of the radio dial. An FM radio tunes from only 88 to 108 megahertz (one megahertz is one million hertz). The axion frequency, by contrast, may be anywhere between 300 hertz and 300 billion hertz. At the rate today's detectors are going, finding the axion or proving that it doesn't exist could take more than 10,000 years.

Squeezing the quantum noise

On the HAYSTAC team, we don't have that kind of patience. So in 2012, we set out to speed up the axion search by doing everything possible to reduce noise.
But by 2017 we found ourselves running up against a fundamental minimum noise limit because of a law of quantum physics known as the uncertainty principle. The uncertainty principle states that it is impossible to know the exact values of certain physical quantities simultaneously – for instance, you can’t know both the position and the momentum of a particle at the same time. Recall that axion detectors search for the axion by measuring two quadratures – those specific kinds of electromagnetic field oscillations. The uncertainty principle prohibits precise knowledge of both quadratures by adding a minimum amount of noise to the quadrature oscillations. In conventional axion detectors, the quantum noise from the uncertainty principle obscures both quadratures equally. This noise can’t be eliminated, but with the right tools, it can be controlled. Our team worked out a way to shuffle around the quantum noise in the HAYSTAC detector, reducing its effect on one quadrature while increasing its effect on the other. This noise manipulation technique is called quantum squeezing. In an effort led by graduate students Kelly Backes and Dan Palken, the HAYSTAC team took on the challenge of implementing squeezing in our detector, using superconducting circuit technology borrowed from quantum computing research. General-purpose quantum computers remain a long way off, but our new paper shows that this squeezing technology can immediately speed up the search for dark matter. Bigger bandwidth, faster search Our team succeeded in squeezing the noise in the HAYSTAC detector. But how did we use this to speed up the axion search? Quantum squeezing doesn’t reduce the noise uniformly across the axion detector bandwidth. Instead, it has the largest effect at the edges. Imagine you tune your radio to 88.3 megahertz, but the station you want is actually at 88.1. With quantum squeezing, you would be able to hear your favorite song playing one station away. In the world of radio broadcasting, this would be a recipe for disaster, because different stations would interfere with one another. But with only one dark matter signal to look for, a wider bandwidth allows physicists to search faster by covering more frequencies at once. In our latest result, we used squeezing to double the bandwidth of HAYSTAC, allowing us to search for axions twice as fast as we could before. Quantum squeezing alone isn’t enough to scan through every possible axion frequency in a reasonable time. But doubling the scan rate is a big step in the right direction, and we believe further improvements to our quantum squeezing system may enable us to scan 10 times faster. Nobody knows whether axions exist or whether they will resolve the mystery of dark matter; but thanks to this unexpected application of quantum technology, we’re one step closer to answering these questions.
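The arithmetic behind "bigger bandwidth, faster search" is straightforward, and the rough sketch below spells it out. The 300 Hz to 300 GHz axion window is from the article; the per-step dwell time, the step bandwidth and the 1 GHz slice being scanned are invented numbers used purely for illustration.

```python
# Illustration of why bandwidth sets the scan rate. All numbers below except
# the idea of a huge axion frequency window are assumptions for illustration.
dwell_time_s = 100.0          # time spent listening at each tuning step (assumed)
bandwidth_hz = 5_000.0        # frequency range covered per step (assumed)
search_span_hz = 1.0e9        # a 1 GHz slice of the possible axion range

steps = search_span_hz / bandwidth_hz
print(f"Steps needed:   {steps:,.0f}")
print(f"Scan time:      {steps * dwell_time_s / 86400 / 365:.2f} years")

# Doubling the usable bandwidth (what squeezing achieved) halves both numbers.
steps_squeezed = search_span_hz / (2 * bandwidth_hz)
print(f"With squeezing: {steps_squeezed * dwell_time_s / 86400 / 365:.2f} years")
```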
<urn:uuid:a1355425-16da-4327-a627-aecb9eaacfe4>
CC-MAIN-2022-05
https://www.inverse.com/science/dark-matter-quantum-mechanics
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320304928.27/warc/CC-MAIN-20220126071320-20220126101320-00250.warc.gz
en
0.925379
1,297
3.625
4
During the past months we've been reporting several breakthroughs in the field of quantum computing, and now IBM seems ready to truly pave the way for quantum computers. Researchers announced they are now able to develop a superconducting qubit made from microfabricated silicon that maintains coherence long enough for practical computation. Whoa! That probably sounds like a lot to swallow, so let's break it down.

Bits and Qubits

Information is measured in 'bits', and a bit may have two positions (described typically as 0 or 1). Quantum computers, however, don't use these bits; instead they use quantum bits, or 'qubits'. But while a bit must be a 0 or a 1, a qubit can be 0, 1, or a superposition of both. This difference might seem small and subtle, but in fact it is absolutely humongous: a register of just a few hundred qubits can represent more states than there are atoms in the observable Universe. Needless to say, a computer running on qubits would be game changing, in pretty much the same way microprocessors were in their day. But what makes quantum computing extremely difficult is a problem called 'decoherence'. In the quantum world, things don't happen as they do in the 'real world'; when a qubit moves from the 0 state to the 1 state or into a superposition, it can decohere back to state 0 due to interference from other parts of the computer. Generally speaking, decoherence is the loss of the definite phase relationships between the components of a superposition. So in order for quantum computers to be practical and scalable, the system would have to remain coherent for a long enough time to allow error-correction techniques to function properly. "In 1999, coherence times were about 1 nanosecond," said IBM scientist Matthias Steffen. "Last year, coherence times were achieved for as long as 1 to 4 microseconds. With these new techniques, we've achieved coherence times of 10 to 100 microseconds. We need to improve that by a factor of 10 to 100 before we're at the threshold we want to be. But considering that in the past ten years we've increased coherence times by a factor of 10,000, I'm not scared."

Two different approaches, one breakthrough

IBM announced they took two different approaches, both of which played a significant part in the breakthrough they revealed. The first one was to build a 3-D qubit made from superconducting, microfabricated silicon. The main advantage here is that the equipment and know-how necessary to create this technology already exist: nothing new has to be invented, thanks to developments made by Yale researchers (for which Steffen expressed a deep admiration). Using this approach, they managed to maintain coherence for 95 microseconds – "But you could round that to 100 for the piece if you want," Steffen joked. The second idea involved a traditional 2-D qubit, which IBM's scientists used to build a "Controlled NOT gate" or CNOT gate, which is a building block of quantum computing. A CNOT gate connects two qubits in such a way that the second qubit will flip its state if the first qubit is in the state 1. The CNOT gate was able to produce a coherence of 10 microseconds, which is long enough to show a 95% accuracy rate – a notable improvement over the 81% accuracy rate that was the highest achieved until now. Of course, the technology is still years away from actually being on the shelves, but the developments are very impressive.
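For reference, the coherence numbers Steffen quotes translate directly into the improvement factors he describes; this is a trivial calculation, using the lower ends of the quoted ranges, included only to make the progression explicit.

```python
# Coherence times quoted in the article (seconds), taking the lower ends of
# the quoted ranges.
coherence_1999 = 1e-9        # ~1 nanosecond
coherence_prev = 1e-6        # 1 to 4 microseconds ("last year")
coherence_now  = 10e-6       # 10 to 100 microseconds with the new techniques

print(f"Improvement since 1999: ~{coherence_now / coherence_1999:,.0f}x")
print(f"Target (another 10-100x): {coherence_now * 10 * 1e3:.1f} to "
      f"{coherence_now * 100 * 1e3:.0f} ms of coherence")
```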
From quantum to reality

Given the rapid progress that is being made in the field of quantum computing, one can only feel that a quantum computer is looking more and more like a real possibility. As error correction protocols become more accurate and coherence times grow longer, we are moving more and more towards accurate quantum computing – but you shouldn't expect a quantum smartphone just yet. "There's a growing sense that a quantum computer can't be a laptop or desktop," said Steffen. "Quantum computers may well just be housed in a large building somewhere. It's not going to be something that's very portable. In terms of application, I don't think that's a huge detriment because they'll be able to solve problems so much faster than traditional computers." The next steps are simple, in principle, but extremely hard to do in practice. The accuracy rate has to reach at least 99.99%, up to the point where the system achieves what is called a 'logical qubit' – one that, for practical purposes, doesn't suffer decoherence. From that point, the only thing left to do is develop the quantum computer architecture, and this will prove troublesome too – but the reward is definitely worth it. "We are very excited about how the quantum computing field has progressed over the past ten years," he told me. "Our team has grown significantly over the past 3 years, and I look forward to seeing that team continue to grow and take quantum computing to the next level."
<urn:uuid:8a12a97a-e36e-4f93-abc7-bd1804e3e877>
CC-MAIN-2022-05
https://www.zmescience.com/research/ibm-quantum-computer-28022012/
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320300343.4/warc/CC-MAIN-20220117061125-20220117091125-00370.warc.gz
en
0.956899
1,117
3.84375
4
40 years ago, Nobel Prize-winner Richard Feynman argued that "nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical." This was later perceived as a rallying cry for developing a quantum computer, leading to today's rapid progress in the search for quantum supremacy. Here's a very short history of the evolution of quantum computing.

1905: Albert Einstein explains the photoelectric effect—shining light on certain materials can function to release electrons from the material—and suggests that light itself consists of individual quantum particles or photons.
1924: The term quantum mechanics is first used in a paper by Max Born.
1925: Werner Heisenberg, Max Born, and Pascual Jordan formulate matrix mechanics, the first conceptually autonomous and logically consistent formulation of quantum mechanics.
1925 to 1927: Niels Bohr and Werner Heisenberg develop the Copenhagen interpretation, one of the earliest interpretations of quantum mechanics, which remains one of the most commonly taught.
1930: Paul Dirac publishes The Principles of Quantum Mechanics, a textbook that has become a standard reference and is still used today.
1935: Albert Einstein, Boris Podolsky, and Nathan Rosen publish a paper highlighting the counterintuitive nature of quantum superpositions and arguing that the description of physical reality provided by quantum mechanics is incomplete.
1935: Erwin Schrödinger, discussing quantum superposition with Albert Einstein and critiquing the Copenhagen interpretation of quantum mechanics, develops a thought experiment in which a cat (forever known as Schrödinger's cat) is simultaneously dead and alive; Schrödinger also coins the term "quantum entanglement."
1947: Albert Einstein refers for the first time to quantum entanglement as "spooky action at a distance" in a letter to Max Born.
1976: Roman Stanisław Ingarden of the Nicolaus Copernicus University in Toruń, Poland, publishes one of the first attempts at creating a quantum information theory.
1980: Paul Benioff of the Argonne National Laboratory publishes a paper describing a quantum mechanical model of a Turing machine, or a classical computer, the first to demonstrate the possibility of quantum computing.
1981: In a keynote speech titled Simulating Physics with Computers, Richard Feynman of the California Institute of Technology argues that a quantum computer has the potential to simulate physical phenomena that a classical computer could not simulate.
1985: David Deutsch of the University of Oxford formulates a description for a quantum Turing machine.
1992: The Deutsch–Jozsa algorithm, one of the first examples of a quantum algorithm that is exponentially faster than any possible deterministic classical algorithm, is proposed.
1993: The first paper describing the idea of quantum teleportation is published.
1994: Peter Shor of Bell Laboratories develops a quantum algorithm for factoring integers that has the potential to decrypt RSA-encrypted communications, a widely used method for securing data transmissions.
1994: The National Institute of Standards and Technology organizes the first US government-sponsored conference on quantum computing.
1996: Lov Grover of Bell Laboratories invents the quantum database search algorithm.
1998: First demonstration of quantum error correction; first proof that a certain subclass of quantum computations can be efficiently emulated with classical computers.
1999: Yasunobu Nakamura of the University of Tokyo and Jaw-Shen Tsai of Tokyo University of Science demonstrate that a superconducting circuit can be used as a qubit.
2002: The first version of the Quantum Computation Roadmap, a living document involving key quantum computing researchers, is published.
2004: First five-photon entanglement demonstrated by Jian-Wei Pan's group at the University of Science and Technology of China.
2011: The first commercially available quantum computer is offered by D-Wave Systems.
2012: 1QB Information Technologies (1QBit), the first dedicated quantum computing software company, is founded.
2014: Physicists at the Kavli Institute of Nanoscience at the Delft University of Technology, The Netherlands, teleport information between two quantum bits separated by about 10 feet with zero percent error rate.
2017: Chinese researchers report the first quantum teleportation of independent single-photon qubits from a ground observatory to a low Earth orbit satellite over a distance of up to 1,400 km.
2018: The National Quantum Initiative Act is signed into law by President Donald Trump, establishing the goals and priorities for a 10-year plan to accelerate the development of quantum information science and technology applications in the United States.
2019: Google claims to have reached quantum supremacy by performing a series of operations in 200 seconds that would take a supercomputer about 10,000 years to complete; IBM responds by suggesting it could take 2.5 days instead of 10,000 years, highlighting techniques a supercomputer may use to maximize computing speed.

The race for quantum supremacy is on: the goal is to demonstrate a practical quantum device that can solve a problem that no classical computer can solve in any feasible amount of time. Speed—and sustainability—has always been the measure of the jump to the next stage of computing. In 1944, Richard Feynman, then a junior staff member at Los Alamos, organized a contest between human computers and the Los Alamos IBM facility, with both performing a calculation for the plutonium bomb. For two days, the human computers kept up with the machines. "But on the third day," recalled an observer, "the punched-card machine operation began to move decisively ahead, as the people performing the hand computing could not sustain their initial fast pace, while the machines did not tire and continued at their steady pace" (see When Computers Were Human, by David Alan Grier).
<urn:uuid:d6d66360-ccb2-45e9-8a88-3ef00813f2a2>
CC-MAIN-2022-05
https://www.forbes.com/sites/gilpress/2021/05/18/27-milestones-in-the-history-of-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2022-05/segments/1642320301475.82/warc/CC-MAIN-20220119155216-20220119185216-00691.warc.gz
en
0.889865
1,155
3.671875
4
Light and matter are typically viewed as distinct entities that follow their own, unique rules. Matter has mass and typically exhibits interactions with other matter, while light is massless and does not interact with itself. Yet, wave-particle duality tells us that matter and light both act sometimes like particles, and sometimes like waves. Harnessing the shared wave nature of light and matter, researchers at the University of Chicago, led by Jonathan Simon, the Neubauer Family Assistant Professor of Physics, have used light to explore some of the most intriguing questions in the quantum mechanics of materials. The topic encompasses complex and non-intuitive phenomena that are often difficult to explain in non-technical language, but which carry important implications to specialists in the field. In work published online this week in the journal Nature, Simon’s group presents new experimental observations of a quantum Hall material near a singularity of curvature in space. Quantum effects give rise to some of the most useful and promising properties of materials: They define standard units of measurement, give rise to superconductivity and describe quantum computers. The quantum Hall materials are one prominent example in which electrons are trapped in non-conducting circular orbits except at the edges of the material. There, electrons exhibit quantized resistance-free electrical conduction that is immune to disorder such as material impurities or surface defects. Furthermore, electrons in quantum Hall materials do not transmit sound waves but instead have particle-like excitations, some of which are unlike any other particles ever discovered. Some of these materials also exhibit simultaneous quantum entanglement between millions of electrons, meaning that the electrons are so interconnected, the state of one instantly influences the state of all others. This combination of properties makes quantum Hall materials a promising platform for future quantum computation. Researchers worldwide have spent the past 35 years delving into the mysteries of quantum Hall materials, but always in the same fundamental way. They use superconducting magnets to make very powerful magnetic fields and refrigerators to cool electronic samples to thousandths of a degree above absolute zero. In a new approach, Simon and his team demonstrated the creation of a quantum Hall material made up of light. “Using really good mirrors that are pointed at each other, we can trap light for a long time while it bounces back and forth many thousands of times between the mirrors,” explained graduate student Nathan Schine. In the UChicago experiment, photons travel back and forth between mirrors, while their side-to-side motion mimics the behavior of massive particles like electrons. To emulate a strong magnetic field, the researchers created a non-planar arrangement of four mirrors that makes the light twist as it completes a round trip. The twisting motion causes the photons to move like charged particles in a magnetic field, even though there is no actual magnet present. “We make the photons spin, which leads to a force that has the same effect as a magnetic field,” explained Schine. While the light is trapped, it behaves like the electrons in a quantum Hall material. First, Simon’s group demonstrated that they had a quantum Hall material of light. To do so, they shined infrared laser light at the mirrors. 
By varying the laser's frequency, Simon's team could map out precisely at which frequencies the laser was transmitted through the mirrors. These transmission frequencies, along with camera images of the transmitted light, gave a telltale signature of a quantum Hall state. Next, the researchers took advantage of the precise control that advanced optical systems provide to place the photons in curved space, which has not been possible so far with electrons. In particular, they made the photons behave as if they resided on the surface of a cone.

...Near a singularity

"We created a cone for light, much like you might do by cutting a wedge of paper and taping the edges together," said postdoctoral fellow Ariel Sommer, also a co-author of the paper. "In this case, we imposed a three-fold symmetry on our light, which essentially divides the plane into three wedges and forces the light to repeat itself on each wedge." The tip of a cone has infinite curvature—the singularity—so the researchers were able to study the effect of strong spatial curvature in a quantum Hall material. They observed that photons accumulated at the cone tip, confirming a previously untested theory of the quantum Hall effect in curved space. Despite 20 years of interest, this is the first time an experiment has observed the behavior of quantum materials in curved space. "We are beginning to make our photons interact with each other," said Schine. "This opens up many possibilities, such as making crystalline or exotic quantum liquid states of light. We can then see how they respond to spatial curvature." The researchers say this could be useful for characterizing a certain type of quantum computer that is built of quantum Hall materials. "While quantum Hall materials were discovered in the '80s, they continue to reveal their fascinating secrets to this day," said Simon. "The final frontier is exploring the interplay of these beautiful materials with the curvature of space. That is what we've begun to explore with our photons."

Citation: "Synthetic Landau levels for photons," Nature Advance Online Publication, June 8, 2016, by Nathan Schine, Albert Ryou, Andrey Gromov, Ariel Sommer and Jonathan Simon. DOI: 10.1038/nature17943.

Funding: U.S. Department of Energy, Defense Advanced Research Projects Agency, Air Force Office of Scientific Research.
https://news.uchicago.edu/story/uchicago-physicists-first-see-behavior-quantum-materials-curved-space
Quantum teleportation is a technique for transferring quantum information from a sender at one location to a receiver some distance away. While teleportation is portrayed in science fiction as a means to transfer physical objects from one location to the next, quantum teleportation only transfers quantum information. For the first time, a team of scientists and researchers has achieved sustained, high-fidelity quantum teleportation – the transfer of ‘qubits’ (quantum bits), the basic units of quantum information. The collaborative team, which includes NASA’s Jet Propulsion Laboratory, successfully demonstrated sustained, long-distance teleportation of qubits of photons (quanta of light) with fidelity greater than 90%. The qubits were teleported 44 kilometers (27 miles) over a fiber-optic network using state-of-the-art single-photon detectors and off-the-shelf equipment. An important point to keep in mind is that quantum teleportation is the transfer of quantum states from one location to another using quantum entanglement, in which two particles in separate locations are linked by a correlation that Albert Einstein famously called “spooky action at a distance.” Regardless of the distance, the encoded information shared by the “entangled” pair of particles can be passed between them. An interesting note is that the sender knows neither the location of the recipient nor the quantum state that will be transferred. By sharing these qubits, the basic units of quantum computing, researchers are hoping to create networks of quantum computers that can share information at blazing-fast speeds. But keeping this information flow stable over long distances has proven extremely difficult because of changes in the environment, including noise. Researchers are now hoping to scale up such a system, using entanglement to send information and quantum memory to store it. On the same front, scientists have advanced their quantum technology research by developing a nanochip – said to be less than one-tenth the thickness of a human hair – that produces enough stable photons encoded with quantum information to scale the technology up towards the quantum simulator of the future. Such a chip may enable the scientists to achieve ‘quantum supremacy’ – the point at which a quantum device can solve a given computational task faster than the world’s most powerful supercomputer. In quantum entanglement, particles that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle – up or down – allows one to know that the spin of its mate is in the opposite direction. Entangled qubits remain correlated no matter how far apart they are, and the correlation appears instantaneous; it cannot, however, be used on its own to send a message faster than light. The particles will remain entangled as long as they are isolated. In July, the US Department of Energy unveiled a blueprint for the first quantum internet, connecting several of its National Laboratories across the country. A quantum internet would be able to transmit large volumes of data across immense distances with a speed and security that today’s networks cannot match. You can imagine all the applications that can benefit from such a network. Traditional computer data is coded in either zeros or ones; quantum information is superimposed in both zeros and ones simultaneously.
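To make the protocol described above concrete, here is a minimal state-vector simulation of single-qubit teleportation in NumPy. It is an illustrative sketch only (not code from the NASA/JPL experiment), and the teleported state 0.6|0⟩ + 0.8i|1⟩ is just an example value.

```python
import numpy as np

# Single-qubit states and gates
zero = np.array([1, 0], dtype=complex)
one  = np.array([0, 1], dtype=complex)
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
# CNOT with the first qubit as control, second as target
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# 1. The state Alice wants to teleport (an arbitrary example)
psi = np.array([0.6, 0.8j], dtype=complex)          # 0.6|0> + 0.8i|1>

# 2. Alice and Bob share the entangled Bell pair (|00> + |11>)/sqrt(2)
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Three-qubit register: qubit 0 holds psi, qubits 1 and 2 hold the Bell pair
state = np.kron(psi, bell)

# 3. Alice applies CNOT (qubit 0 controls qubit 1), then H on qubit 0
state = np.kron(CNOT, I2) @ state
state = np.kron(H, np.kron(I2, I2)) @ state

# 4. Alice measures her two qubits; pick one outcome at random
probs = np.abs(state) ** 2
outcome = np.random.choice(8, p=probs / probs.sum())
m0, m1 = (outcome >> 2) & 1, (outcome >> 1) & 1

# Project onto that outcome; what remains is Bob's qubit
bob = state.reshape(2, 2, 2)[m0, m1, :]
bob = bob / np.linalg.norm(bob)

# 5. Bob applies corrections based on Alice's two classical bits
if m1:
    bob = X @ bob
if m0:
    bob = Z @ bob

print(np.allclose(bob, psi))   # True: the state arrived intact
```

Note that step 5 is why teleportation cannot outrun light: Bob can only recover the state after Alice's two measurement bits reach him over an ordinary classical channel.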
Academics, researchers, and IT professionals will need to create devices for the infrastructure of the quantum internet, including quantum routers, repeaters, gateways, hubs, and other quantum tools. A whole new industry will be born around the quantum internet, existing in parallel to the current ecosystem of companies built on the regular internet. The “traditional internet”, as the regular internet is sometimes called, will still exist: large organizations are expected to rely on the quantum internet to safeguard data, while individual consumers continue to use the classical internet. Experts predict that the financial sector will benefit from the quantum internet when it comes to securing online transactions, and the healthcare and public sectors are also expected to see benefits. In addition to providing a faster, safer internet experience, quantum computing will better position organizations to solve complex problems such as supply-chain management. It will also expedite the exchange of vast amounts of data and the running of large-scale sensing experiments in astronomy, materials discovery, and the life sciences. Ahmed Banafa is an expert in new technology with appearances on ABC, NBC, CBS, FOX TV and radio stations. He has served as a professor, academic advisor, and coordinator at well-known American universities and colleges. His research has been featured in Forbes, MIT Technology Review, Computerworld, and Techonomy. He has published over 100 articles about the internet of things, blockchain, artificial intelligence, cloud computing, and big data, and his research papers are used in many patents, numerous theses, and conferences. He is also a guest speaker at international technology conferences. He is the recipient of several awards, including a Distinguished Tenured Staff Award, Instructor of the Year, and a Certificate of Honor from the City and County of San Francisco. Ahmed studied cybersecurity at Harvard University. He is the author of the book Secure and Smart Internet of Things Using Blockchain and AI.
https://www.bbntimes.com/technology/quantum-teleportation-facts-and-myths
Light and matter can interact in a number of different ways. Under certain conditions, light particles (photons) can affect the movement of atoms of matter, because the emission and absorption of photons are accompanied by recoil. Such interactions are the subject of a subdivision of physics known as quantum optomechanics. The study of how and under which conditions particles of matter interact with light has a great number of practical applications, and its scope may expand further as humanity heads towards computing devices based on the interaction of photons rather than electrons – for example, optical computers. “Imagine a chain of atoms placed in the vicinity of an optical waveguide through which photons can propagate. Each atom is a two-level system, meaning that it has two states – ground and excited. An atom can change its state due to the absorption of a photon or its emission. Such systems could be applied in the rapidly developing field of quantum computing,” explains Denis Sedov, a student at ITMO’s Faculty of Physics and Engineering. Strong coupling wanted However, it is impossible to use such models for creating prototypes unless scientists find ways to tackle multiple challenges. One of these fundamental issues is the relatively weak interaction of light and atoms, which interferes with effective control of the atoms’ state using light particles. To solve this problem, a research team from ITMO University has created a theoretical model of a system with a strong-coupling regime. “In our work, we presented an optomechanical system with the possibility of intensive interaction,” says Denis Sedov, one of the researchers on the project. “It is a ring waveguide in which photons are able to propagate only clockwise. The atoms are located above the waveguide in optical traps. There, they not only interact with each other with the help of photons but also oscillate near their equilibrium positions.” Although similar systems have been studied in the past, such a unidirectional ring geometry of the waveguide and a full quantum treatment of the atomic vibrations were considered here for the first time, and the study produced new and unexpected results. It turned out that when two atoms are located above a completely chiral – that is, unidirectional – waveguide, the model is equivalent to the well-known and actively studied quantum Rabi model. The latter describes the interaction between a two-level system placed in an optical resonator (an arrangement of mirrors) and the electromagnetic field of the resonator. This model has a Z2 symmetry, and the states describing the system can be divided into two categories: some are characterized by an even number of excitations, others by an odd number. Interestingly, this symmetry is mathematically similar to a 180-degree rotation of a figure, where two consecutive rotations are equivalent to no rotation. Even if the waveguide has no chirality, many of the obtained properties of the system are retained, but non-chirality leads to new, unexplored models that have yet to be investigated. The research also proved the presence of a Z3 symmetry in a three-atom system. “In simple cases, Z3 can be understood as symmetry under rotations of 120, 240, and 360 degrees, which make the system transform into itself.
But although we have a symmetry of a more complex nature and description, the principle remains unchanged,” says Denis Sedov, a student at ITMO University. There is also a quantum phase transition in the system. A phase transition is a transformation in which some properties of the system change drastically – for instance, the abrupt change in density when ice melts. The difference between a quantum phase transition and a classical one is that the former occurs at absolute zero. In the studied model, when the optomechanical coupling – which describes the interaction between photons and the mechanical motion of atoms – reaches a critical value, a quantum phase transition occurs. It is accompanied by self-organization of the atoms above the waveguide: the atoms communicate with one another through photons, each travelling in its own specific direction. “Under strong optomechanical coupling, the system is in a multicomponent Schrödinger-cat ground state, that is, in a superposition of multiple classical states of the atoms’ motion,” says Valerii Kozin, a PhD student at ITMO’s Faculty of Physics and Engineering. “Such systems can be used to create error-tolerant protocols for storing and processing quantum information.” One of the major problems in creating devices for storing and processing quantum information (quantum computers) is that quantum states are extremely fragile and must be isolated from the environment. Scientists are therefore challenged to figure out how to store quantum information so that it remains resilient to errors, and here they may turn to the multicomponent Schrödinger-cat states that arise naturally in the proposed systems. The paper was published in Physical Review Letters. Reference: D. D. Sedov, V. K. Kozin, and I. V. Iorsh, “Chiral Waveguide Optomechanics: First Order Quantum Phase Transitions with Z3 Symmetry Breaking,” Physical Review Letters, 2020. DOI: 10.1103/PhysRevLett.125.263606
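For reference, the quantum Rabi model invoked above is usually written in the following textbook form. This is the generic model, not the specific chiral-waveguide Hamiltonian derived in the ITMO paper:

```latex
H_{\text{Rabi}} = \hbar\omega\, a^{\dagger}a + \frac{\hbar\Omega}{2}\,\sigma_z + \hbar g\,(a + a^{\dagger})\,\sigma_x ,
\qquad
\Pi = e^{\,i\pi\,(a^{\dagger}a + \sigma_{+}\sigma_{-})}, \quad [H_{\text{Rabi}},\Pi] = 0 .
```

Here ω is the resonator (oscillator) frequency, Ω the two-level splitting, and g the coupling strength. The conserved parity operator Π has eigenvalues ±1, which is exactly the division into states with an even or an odd number of excitations – the Z2 symmetry described in the text.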
https://news.itmo.ru/en/science/photonics/news/10064/
Credit: University of Rochester photo / J. Adam Fenster Quantum computing has the potential to revolutionize technology, medicine, and science by providing faster and more efficient processors, sensors, and communication devices. But transferring information and correcting errors within a quantum system remains a challenge to making effective quantum computers. In a paper in the journal Nature, researchers from Purdue University and the University of Rochester, including John Nichol, an assistant professor of physics, and Rochester PhD students Yadav P. Kandel and Haifeng Qiao, demonstrate their method of relaying information by transferring the state of electrons. The research brings scientists one step closer to creating fully functional quantum computers and is the latest example of Rochester’s initiative to better understand quantum behavior and develop novel quantum systems. The University recently received a $4 million grant from the Department of Energy to explore quantum materials. A quantum computer operates on the principles of quantum mechanics, a unique set of rules that govern at the extremely small scale of atoms and subatomic particles. When dealing with particles at these scales, many of the rules that govern classical physics no longer apply and quantum effects emerge; a quantum computer is able to perform complex calculations, factor extremely large numbers, and simulate the behaviors of atoms and particles at levels that classical computers cannot. Quantum computers have the potential to provide more insight into principles of physics and chemistry by simulating the behavior of matter at unusual conditions at the molecular level. These simulations could be useful in developing new energy sources and studying the conditions of planets and galaxies or comparing compounds that could lead to new drug therapies. “You and I are quantum systems. The particles in our body obey quantum physics. But, if you try to compute what happens with all of the atoms in our body, you cannot do it on a regular computer,” Nichol says. “A quantum computer could easily do this.” Quantum computers could also open doors for faster database searches and cryptography. “It turns out that almost all of modern cryptography is based on the extreme difficulty for regular computers to factor large numbers,” Nichol says. “Quantum computers can easily factor large numbers and break encryption schemes, so you can imagine why lots of governments are interested in this.” BITS VS. QUBITS A regular computer consists of billions of transistors, called bits. Quantum computers, on the other hand, are based on quantum bits, also known as qubits, which can be made from a single electron. Unlike ordinary transistors, which can be either “0” or “1,” qubits can be both “0” and “1” at the same time. The ability for individual qubits to occupy these “superposition states,” where they are simultaneously in multiple states, underlies the great potential of quantum computers. Just like ordinary computers, however, quantum computers need a way to transfer information between qubits, and this presents a major experimental challenge. “A quantum computer needs to have many qubits, and they’re really difficult to make and operate,” Nichol says. 
“The state-of-the art right now is doing something with only a few qubits, so we’re still a long ways away from realizing the full potential of quantum computers.” All computers, including both regular and quantum computers and devices like smart phones, also have to perform error correction. A regular computer contains copies of bits so if one of the bits goes bad, “the rest are just going to take a majority vote” and fix the error. However, quantum bits cannot be copied, Nichol says, “so you have to be very clever about how you correct for errors. What we’re doing here is one step in that direction.” Quantum error correction requires that individual qubits interact with many other qubits. This can be difficult because an individual electron is like a bar magnet with a north pole and a south pole that can point either up or down. The direction of the pole–whether the north pole is pointing up or down, for instance–is known as the electron’s magnetic moment or quantum state. If certain kinds of particles have the same magnetic moment, they cannot be in the same place at the same time. That is, two electrons in the same quantum state cannot sit on top of each other. “This is one of the main reasons something like a penny, which is made out of metal, doesn’t collapse on itself,” Nichol says. “The electrons are pushing themselves apart because they cannot be in the same place at the same time.” If two electrons are in opposite states, they can sit on top of each other. A surprising consequence of this is that if the electrons are close enough, their states will swap back and forth in time. “If you have one electron that’s up and another electron that’s down and you push them together for just the right amount of time, they will swap,” Nichol says. “They did not switch places, but their states switched.” To force this phenomenon, Nichol and his colleagues cooled down a semiconductor chip to extremely low temperatures. Using quantum dots–nanoscale semiconductors–they trapped four electrons in a row, then moved the electrons so they came in contact and their states switched. “There’s an easy way to switch the state between two neighboring electrons, but doing it over long distances–in our case, it’s four electrons–requires a lot of control and technical skill,” Nichol says. “Our research shows this is now a viable approach to send information over long distances.” A FIRST STEP Transmitting the state of an electron back and forth across an array of qubits, without moving the position of electrons, provides a striking example of the possibilities allowed by quantum physics for information science. “This experiment demonstrates that information in quantum states can be transferred without actually transferring the individual electron spins down the chain,” says Michael Manfra, a professor of physics and astronomy at Purdue University. “It is an important step for showing how information can be transmitted quantum-mechanically–in manners quite different than our classical intuition would lead us to believe.” Nichol likens this to the steps that led from the first computing devices to today’s computers. That said, will we all someday have quantum computers to replace our desktop computers? “If you had asked that question of IBM in the 1960s, they probably would’ve said no, there’s no way that’s going to happen,” Nichol says. “That’s my reaction now. But, who knows?”
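One way to make the "states swap while the electrons stay put" idea concrete is a toy state-vector simulation: put an arbitrary qubit state on the first of four spins and pass it down the chain with nearest-neighbour SWAP operations. This is only an illustrative sketch (the real experiment relies on exchange interactions between electrons in quantum dots, not on literal SWAP matrices), and the example amplitudes are arbitrary.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
# SWAP exchanges the states of two neighbouring qubits
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

def swap_on(n_qubits, i):
    """SWAP acting on qubits i and i+1 of an n-qubit register."""
    op = np.eye(1, dtype=complex)
    q = 0
    while q < n_qubits:
        if q == i:
            op = np.kron(op, SWAP)
            q += 2
        else:
            op = np.kron(op, I2)
            q += 1
    return op

# Arbitrary state on spin 0 (just an example); spins 1-3 start in |0>
psi = np.array([0.8, 0.6j], dtype=complex)
zero = np.array([1, 0], dtype=complex)
state = psi
for _ in range(3):
    state = np.kron(state, zero)

# Shuttle the state down the chain: swap (0,1), then (1,2), then (2,3)
for i in range(3):
    state = swap_on(4, i) @ state

# The last qubit now carries psi; the first three are back in |0>
final = state.reshape(2, 2, 2, 2)
print(np.allclose(final[0, 0, 0, :], psi))   # True
```

The point mirrored from the experiment is that only the quantum state travels down the row; nothing in the simulation moves a particle from one site to another.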
https://bioengineer.org/new-research-brings-scientists-one-step-closer-to-a-fully-functioning-quantum-computer/
Physically, but not socially, isolated: Insights from a small Micronesian island (Eileen Tipoe, Economics) The small Micronesian island of Yap is both geographically and economically isolated. Historically, the Yapese did engage in trade with other islanders, but it happened infrequently and Yap did not have close cultural links with its trading partners. Under this near-isolation, around two centuries ago the Yapese developed an innovative exchange system that comes surprisingly close to how money is used today. While the Yapese used pearl shells as everyday currency, the island lacked the materials that people could use to make coins of greater value, such as durable rock or precious metals. So, the Yapese travelled hundreds of kilometres across the sea to Palau, where they carved large limestone discs (called rai) from quarries. These discs varied in size, from a few inches to twelve feet in diameter, and the largest required many men to lift. To make the discs easier to transport, the Yapese carved a hole into the middle of each disc so a rope or wooden pole could be used to carry it. The quarrying and transport of rai was an important economic activity. According to one estimate, in the late 19th century, over 10% of adult men were involved in these expeditions (Bryan, 2004). Village chiefs acted like today’s central bankers, controlling the number of expeditions and quantity of rai in circulation. Village chiefs were also responsible for determining the value of each rai (in terms of pearl shells), which not only depended on its size but also the difficulty involved with obtaining it. For example, rai that involved greater risks when quarrying or were cut using shell tools and transported by canoes were valued more highly than similarly-sized rai that were cut using iron tools or transported by Western ships. Rai that were very costly to obtain were even given names, such as the village chief’s name or the name of the canoe that transported it. The value of each rai was never written down, but was instead kept in collective memory, passed down from generation to generation by tribal elders. Each rai has its own unique story that is also part of the oral history, detailing the relationships and transactions involving it. These stories help rai retain their value even once they become old or broken, making them worth more than comparably-sized newer rai. This common knowledge also makes it impossible for villagers to steal or counterfeit rai, and so large rai that are difficult to move can be publicly displayed around the island, sometimes in a symbolic location, or in “stone money banks” in the village centre. Rai were exclusively traded within Yap, and the biggest rai were mainly used for large transactions, such as to purchase a plantation, and for conceptual exchanges (celebrating an event or recognising a favour). Like electronic money in modern economies, transactions involving rai do not require physical movement of the currency, only the communication that the currency’s ownership has been transferred. Even though Yap adopted the US dollar as its official currency in 1986, rai are still used in this way today, and continue to be passed down within families from generation to generation. While Yap’s system of exchange has vastly different features from modern financial systems, both share a common element: trust. Without credibility, social interactions could not take place. And even the most geographically-isolated communities recognise the importance of social interactions. 
Each individual does not function in isolation, as a self-sustaining “Robinson Crusoe economy”, but instead relies on and benefits from interactions with others. The social distancing measures due to the COVID-19 pandemic have shown just how much we value these interactions, and the worth of a social interaction cannot be entirely measured in terms of money. Economics is not only about money, it is also about the people who use it and the way they interact. How the economy, the society it is embedded in, and institutions that govern it function in an interrelated manner is what economics seeks to understand. Bryan, M. F. (2004). Island money. Federal Reserve Bank of Cleveland. Gilliland, C.L.C. (1975). “The stone money of Yap: A numismatic survey,” Smithsonian Studies in History and Technology, Number 23. Washington, D.C.: Smithsonian Institution Press. Goldberg, D. (2005). Famous myths of “fiat money”. Journal of Money, Credit and Banking, 957-967. Poole, R. M. (2018). “The tiny island with human-sized money”. BBC, 3 May. Accessible at http://www.bbc.com/travel/story/20180502-the-tiny-island-with-human-sized-money
https://www.mansfield.ox.ac.uk/physically-not-socially-isolated-insights-small-micronesian-island-eileen-tipoe-economics
In the last post, we described what qubits are and how quantum computing involves the manipulation of these qubits to perform useful calculations. In this post, we’ll abstract away from the details of the physics of qubits and just call the two observable states |0⟩ and |1⟩, rather than |ON⟩ and |OFF⟩. This will be useful for ultimately describing quantum algorithms. But before we get there, we need to take a few more steps into the details of quantum gates. Recap: the general description of a qubit is |Ψ⟩ = α|0⟩ + β|1⟩, where α and β are called amplitudes, and |α|² and |β|² are the probabilities of observing the system in the state |0⟩ and |1⟩, respectively. We can also express the states of qubits as vectors, like so: |0⟩ = (1, 0)ᵀ, |1⟩ = (0, 1)ᵀ, and the general state |Ψ⟩ = α|0⟩ + β|1⟩ becomes the column vector (α, β)ᵀ. Quantum gates are transformations from quantum states to other quantum states. We can express these transformations as matrices, which when applied to state vectors yield new state vectors. Here’s a simple example of a quantum gate called the X gate: X = [[0, 1], [1, 0]] (written row by row). Applied to the states |0⟩ and |1⟩, this gate yields X|0⟩ = |1⟩ and X|1⟩ = |0⟩. Applied to any general state, this gate yields: X(α|0⟩ + β|1⟩) = β|0⟩ + α|1⟩, so X simply swaps the two amplitudes. Another gate that is used all the time is the Hadamard gate, or H gate: H = (1/√2)·[[1, 1], [1, −1]]. Let’s see what it does to the |0⟩ and |1⟩ states: H|0⟩ = (|0⟩ + |1⟩)/√2 and H|1⟩ = (|0⟩ − |1⟩)/√2. In words, H puts ordinary states into superposition. Superposition is the key to quantum computing. Without it, all we have is a fancier way of talking about classical computing. So it should make sense that H is a very useful gate. One more note on H: When you apply it to a state twice, you get back the state you started with. A simple proof of this comes by just multiplying the H matrix by itself: H·H = (1/2)·[[2, 0], [0, 2]] = I, the identity matrix. Okay, enough with single qubits. While they’re pretty cool as far as they go, any non-trivial quantum algorithm is going to involve multiple qubits. It turns out that everything we’ve said so far generalizes quite nicely. If we have two qubits, we describe the combined system by smushing them together with what’s called a tensor product (denoted ⊗). What this ends up looking like is the following: the combined basis states are |00⟩, |01⟩, |10⟩ and |11⟩. The first number refers to the state of the first qubit, and the second refers to the state of the second. Let’s smush together two arbitrary qubits: (α₁|0⟩ + β₁|1⟩) ⊗ (α₂|0⟩ + β₂|1⟩) = α₁α₂|00⟩ + α₁β₂|01⟩ + β₁α₂|10⟩ + β₁β₂|11⟩. This is pretty much exactly what we should have expected combining qubit states would look like. The amplitude for the combined state to be |00⟩ is just the product of the amplitude for the first qubit to be |0⟩ and the second to be |0⟩. The amplitude for the combined state to be |01⟩ is just the product of the amplitude for the first qubit to be |0⟩ and second to be |1⟩. And so on. We can write a general two-qubit state as a vector with four components. And as you might expect by now, two-qubit gates are simply 4-by-4 matrices that act on such vectors to produce new vectors. For instance, we can calculate the 4×4 matrix corresponding to the action of a Hadamard gate on both qubits: H ⊗ H = (1/2)·[[1, 1, 1, 1], [1, −1, 1, −1], [1, 1, −1, −1], [1, −1, −1, 1]]. Why the two-qubit Hadamard gate has this exact form is a little beyond the scope of this post. Suffice it to say that this is the 4×4 matrix that successfully transforms two qubits as if they had each been put through a single-qubit Hadamard gate. (You can verify this for yourself by simply applying H to each qubit individually and then smushing them together in the way we described above.) Here’s what the two-qubit Hadamard gate does to the four basic two-qubit states: (H ⊗ H)|00⟩ = ½(|00⟩ + |01⟩ + |10⟩ + |11⟩), (H ⊗ H)|01⟩ = ½(|00⟩ − |01⟩ + |10⟩ − |11⟩), (H ⊗ H)|10⟩ = ½(|00⟩ + |01⟩ − |10⟩ − |11⟩), and (H ⊗ H)|11⟩ = ½(|00⟩ − |01⟩ − |10⟩ + |11⟩). We can easily extend this further to three, four, or more qubits.
The state vector describing an N-qubit system must consider the amplitude for all possible combinations of 0s and 1s for each qubit. There are 2ᴺ such combinations (starting at 00…0 and ending at 11…1). So the vector describing an N-qubit system is composed of 2ᴺ complex numbers. If you’ve followed everything so far, then we are now ready to move on to some actual quantum algorithms! In the next post, we’ll see first how qubits can be used to solve problems that classical bits cannot, and then why quantum computers have this enhanced problem-solving ability. Next: Deutsch-Jozsa Algorithm
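Since every gate in the post is just a small matrix, the claims above are easy to check numerically. Here is a short NumPy sketch (the amplitudes 0.6 and 0.8 are arbitrary example values, and the code is an illustration added here rather than part of the original post):

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)               # |0>
ket1 = np.array([0, 1], dtype=complex)               # |1>

X = np.array([[0, 1],
              [1, 0]], dtype=complex)                # X gate
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

print(X @ ket0)                          # -> |1>
print(H @ ket0)                          # -> (|0> + |1>)/sqrt(2)
print(np.allclose(H @ H, np.eye(2)))     # True: applying H twice undoes it

# Two qubits: the tensor product "smushes" states together
psi1 = 0.6 * ket0 + 0.8 * ket1           # example amplitudes
psi2 = (ket0 + ket1) / np.sqrt(2)
two_qubit = np.kron(psi1, psi2)          # amplitudes for |00>, |01>, |10>, |11>
print(two_qubit)

# The two-qubit Hadamard is H (x) H, a 4x4 matrix
HH = np.kron(H, H)
print(HH @ np.kron(ket0, ket0))          # equal amplitudes of 1/2 on all four states
```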
https://risingentropy.com/more-on-quantum-gates/
If we provide a bog-standard computer with an infinite data store, it suddenly becomes a Turing machine: capable of answering any question answerable by any digital computer. Even quantum computers are no more powerful; they are merely faster. For example, a quantum computer recently factorised 15 into 5 * 3 by using Shor’s algorithm; an ordinary computer could do this much more slowly by exhaustive search. Your computer isn’t even as powerful as a Turing machine. Having a mere finite data store, it falls into a weaker class of machines: linear-bounded automata (or LBAs). A linear bounded automaton is just a Turing machine with a large but finite amount of tape. Since we’re so familiar with digital computers, I’ll give examples of other, more unusual, LBAs. Essentially, to make an LBA we just need to be able to make some simple components and connect them together. I’ll mention a few of the conventional and unconventional approaches here. Approach 1: Logic gates This is quite an abstract one. We use Boolean logic operations such as AND, OR and NOT, and have simple components performing each task. For example, the AND gate returns a ‘1’ if both inputs are ‘1’, and ‘0’ otherwise. A better logic gate is negative-AND, or ‘NAND’. This inverts the output, so it returns a ‘0’ if both inputs are ‘1’, and ‘1’ otherwise. By connecting up arrangements of NAND gates, we can emulate any other logic circuit. We can even avoid the need for wire crossings by a strategic arrangement of three XOR gates, each composed of four NAND gates; in the original diagram, the inputs are on the left and the outputs are on the right. Approach 2: Electronic circuits A conventional way to build computers is to make these logic gates out of electronic components. We currently use CMOS (complementary metal-oxide semiconductor), although earlier versions used conventional transistors and even vacuum tubes. A few years ago I visited Manchester and saw the first stored-program computer. The Baby was built by a team of pioneers; our favourite, Alan Turing, arrived in Manchester shortly afterwards to write programs for its successors. If you look closely, you’ll see diodes and some larger vacuum tubes called pentodes. No prizes for analysing the Greek stem and ascertaining how many electrodes a pentode has. Of course, there have been earlier computers, including Charles Babbage’s analytical engine, which has yet to be physically constructed. The first computer program (which calculated the Bernoulli numbers) was written by Ada Byron, daughter of Lord Byron, who happened to go to the same college as Charles Babbage, namely Trinity College, Cambridge. The web of fundamentally interconnected events never ceases to amaze… You may believe that our silicon chips represent the most advanced possible electronic circuits, but this is not the case. Transistors have been built from carbon nanotubes, relying on the fact that they can act as either conductors or semiconductors. These nanotube circuits make silicon chips look like vacuum tubes by comparison! Approach 3: Making logic gates out of sticky mess Professor Andrew Adamatzky has created logic gates in amazing, unconventional ways. Reaction-diffusion systems (first modelled by Alan Turing) occur in certain chemical soups, such as the Belousov-Zhabotinsky reaction. This oscillates in colour violently, and causes spirals to emerge. You can investigate these sorts of patterns in the program Ready. Adamatzky has made similar logic gates using the Plasmodium slime mould.
This is a gooey cluster of cells in a much longer life cycle, involving sporangia (like fungi), spores, amoeboids, gametes (some of which have flagellae, like sperm) and zygotes. Approach 4: Reversible logic A key feature of the NAND gate is that we can’t reverse it: if a ‘1’ is outputted, the inputs could have either been both ‘0’, or one ‘1’ and one ‘0’. In other words, information is continually lost. An alternative is to use only reversible logic gates. My favourite is the Fredkin gate, which has three inputs (A, B, C) and three outputs (A’, B’, C’). The output C’ is an identical copy of C. The inputs A and B are mapped directly to A’ and B’ (if C is ‘0’) or swapped to B’ and A’ (if C is ‘1’). As such, it is occasionally known as a controlled-swap gate. The Fredkin gate has three desirable properties: - Reversibility: given A’, B’ and C’, we can deduce A, B and C. Indeed, the gate is an involution, which means it is equal to its own inverse. - Conservation: A + B + C = A’ + B’ + C’, which means we can represent the pulses as solid particles. - Universality: any logic gate can be emulated with Fredkin gates. The idea led to the concept of a billiard-ball computer, where ‘1’ is represented by a billiard ball and ‘0’ by empty space. Computation is achieved by the balls undergoing elastic collisions with each other. Exactly what the ‘balls’ are is left to your imagination; they could be regular balls, ionised hydrogen atoms, or even planets. Andy Adamatzky has managed to do this with soldier crabs chased by images of predatory birds! Reversible logic can also be implemented electronically. It is of interest as it can be engineered to produce very little heat, which is the main enemy to the continual miniaturisation of silicon chips. Quantum computers also use reversible logic gates, but they can exist in a complex superposition of states rather than just one. To emulate a quantum computer on a classical computer, an exponentially large amount of memory is required. For example, a 30-qubit quantum computer requires more than 1000000000 complex numbers to describe its state. Approach 5: Trains and cellular automata Instead of logic gates, we can have a single pulse moving around a network of components, storing and reading information. A fun implementation of this is a train on a railway track. It transpires that only three types of points are needed, known as the lazy point, sprung point and flip-flop point. Ian Stewart has a brilliant write-up of this. I recall that someone even implemented a linear-bounded automaton on the Chalcraft-Greene train set, which was emulated by the Wireworld cellular automaton. There’s an even simpler cellular automaton, Banks-I, which is able to emulate any linear bounded automaton. The next state of each cell in Banks-I is determined only by that of itself and its four immediate neighbours. Moreover, there are only two states, and the rules are isotropic (reflecting and/or rotating a computer will not affect its operation). Again, this could have practical applications. There’s an emerging field of ‘quantum-dot cellular automata’, where each cell is a mere 60nm wide. This is superior to existing semiconductor-based circuitry. What isn’t a linear-bounded automaton? A Turing machine with infinite memory is more powerful than a linear-bounded automaton. Conversely, if the logic gates ‘burn out’ after a single use, like those in Matt Parker’s domino computer, it is less powerful than a LBA. 
If something can be ‘solved’ in an amount of time linear in the size of the machine, like the Antikythera mechanism, then it is also weaker than a linear-bounded automaton.
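To complement the description of the Fredkin gate above, here is a small Python sketch that brute-forces all eight classical inputs to check reversibility and conservation, and shows how AND and NOT fall out as special cases. The function name and wiring conventions are mine, chosen to match the A/B/C description in the text; this checks only the classical logic, not any quantum behaviour.

```python
from itertools import product

def fredkin(a, b, c):
    """Controlled swap: C passes through; A and B swap exactly when C = 1."""
    if c:
        return b, a, c
    return a, b, c

for a, b, c in product((0, 1), repeat=3):
    out = fredkin(a, b, c)
    # Reversibility (involution): applying the gate twice restores the input
    assert fredkin(*out) == (a, b, c)
    # Conservation: the number of 1s ("billiard balls") is unchanged
    assert sum(out) == a + b + c

# Universality in miniature: AND and NOT emerge from special cases
AND = lambda x, y: fredkin(y, 0, x)[1]   # B' = x AND y  (B input tied to 0)
NOT = lambda x:    fredkin(1, 0, x)[0]   # A' = NOT x    (A = 1, B = 0)
print([AND(x, y) for x, y in product((0, 1), repeat=2)])  # [0, 0, 0, 1]
print([NOT(x) for x in (0, 1)])                            # [1, 0]
```

The NOT case also emits a copy of the input on the other output, which is how a reversible circuit achieves fan-out without ever erasing a bit.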
https://cp4space.hatsya.com/2012/09/14/linear-bounded-automata/