text
stringlengths
4.06k
10.7k
id
stringlengths
47
47
dump
stringclasses
20 values
url
stringlengths
26
321
file_path
stringlengths
125
142
language
stringclasses
1 value
language_score
float64
0.71
0.98
token_count
int64
1.02k
2.05k
score
float64
3.5
4.53
int_score
int64
4
5
Research team develops tiny low-energy device to rapidly reroute light in computer chips Researchers at the National Institute of Standards and Technology (NIST) and their colleagues have developed an optical switch that routes light from one computer chip to another in just 20 billionths of a second—faster than any other similar device. The compact switch is the first to operate at voltages low enough to be integrated onto low-cost silicon chips and redirects light with very low signal loss. The switch's record-breaking performance is a major new step toward building a computer that uses light instead of electricity to process information. Relying on particles of light—photons—to transport data within a computer offers several advantages over electronic communications. Photons travel faster than electrons and don't waste energy by heating up the computer components. Managing that waste heat is a major barrier to improving computer performance. Light signals have been used for decades to transmit information over great distances using optical fibers, but the fibers take up too much room to be used to carry data across a computer chip. The new switch combines nanometer-scale gold and silicon optical, electrical and mechanical components, all densely packed, to channel light into and out of a miniature racetrack, alter its speed, and change its direction of travel. (One nanometer is a billionth of a meter, or about one-hundred-thousandth the width of a human hair.) The NIST-led international team describes the device online today in Science. The device has myriad applications, notes study co-author Christian Haffner of NIST, ETH Zurich and the University of Maryland. In driverless cars, the switch could rapidly redirect a single light beam that must continually scan all parts of the roadway to measure the distance to other automobiles and pedestrians. The device could also make it easier to use more powerful light-based circuits instead of electricity-based ones in neural networks. These are artificial intelligence systems that simulate how neurons in the human brain make decisions about such complex tasks as pattern recognition and risk management. The new technology also uses very little energy to redirect light signals. This feature may help realize the dream of quantum computing. A quantum computer processes data stored in the subtle interrelations between specially prepared pairs of subatomic particles. However, these relationships are extremely fragile, requiring that a computer operate at ultralow temperatures and low power so that the particle pairs are disturbed as little as possible. Because the new optical switch requires little energy—unlike previous optical switches—it could become an integral part of a quantum computer. Haffner and his colleagues, who include Vladimir Aksyuk and Henri Lezec of NIST, say their findings may come as a surprise to many in the scientific community because the results contradict long-held beliefs. Some researchers have thought that opto-electro-mechanical switches would not be practical because they would be bulky, operate too slowly and require voltages too high for the components of a computer chip to tolerate. The switch exploits the wave nature of light. When two identical light waves meet, they can superpose such that the crest of one wave aligns or reinforces the crest of the other, creating a bright pattern known as constructive interference. The two waves may also be exactly out of step, so that the valley of one wave cancels the crest of the other, resulting in a dark pattern—destructive interference. In the team's setup, a light beam is confined to travel inside a miniature highway—a tube-shaped channel known as a waveguide. This linear highway is designed so that it has an off-ramp—some of the light can exit into a racetrack-shaped cavity, just a few nanometers away, etched into a silicon disk. If the light has just the right wavelength, it can whip around the racetrack many times before leaving the silicon cavity. The switch has one other crucial component: a thin gold membrane suspended just a few tens of nanometers above the silicon disk. Some of the light traveling in the silicon racetrack leaks out and strikes the membrane, inducing groups of electrons on the membrane's surface to oscillate. These oscillations, known as plasmons, are a kind of hybrid between a light wave and an electron wave: The oscillating electrons resemble the incoming light wave in that they vibrate at the same frequency, but they have a much shorter wavelength. The shorter wavelength lets researchers manipulate the plasmons over nanoscale distances, much shorter than the length of the original light wave, before converting the oscillations back into light. This, in turn, allows the optical switch to remain extremely compact. By changing the width of the gap between the silicon disk and the gold membrane by only a few nanometers, the researchers could delay or advance the phase of the hybrid light wave—the point in time when the wave reaches a crest or valley. Even minuscule variations in the width of the gap, which the team accomplished by electrostatically bending the gold membrane, dramatically altered the phase. Depending on how much the team had advanced or delayed the phase of the wave, when it recombined with light still traveling in the tube-shaped highway, the two beams interfered either constructively or destructively (see animation). If the light beams match up to interfere constructively, the light will continue in its original direction, traveling down the tube. But if the light beams interfere destructively, canceling each other out, that pathway is blocked. Instead, the light must move in another direction, determined by the orientation of other waveguides, or routes, placed close to the blocked pathway. In this way, the light can be switched at will to any of hundreds of other computer chips. Scientists had once thought that a plasmonic system would greatly attenuate light signals because photons would penetrate the interior of the gold membrane, where electrons would absorb much of the light energy. But the researchers have now proved that assumption wrong. The compactness of the device and a design that ensured that few photons would penetrate the membrane resulted in a loss of just 2.5% of the light signal, compared with 60% with previous switches. That puts the switch, although still a prototype, within reach of commercial applications. The team is now working to make the device even smaller by shortening the distance between the silicon disk and the gold membrane. This would further reduce signal loss, making the technology even more appealing to industry.
<urn:uuid:7160fd01-b93e-463b-a7f3-2e496748bd17>
CC-MAIN-2022-33
https://phys.org/news/2019-11-team-tiny-low-energy-device-rapidly.html?utm_source=nwletter&utm_medium=email&utm_campaign=daily-nwletter
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571869.23/warc/CC-MAIN-20220813021048-20220813051048-00448.warc.gz
en
0.93372
1,320
3.65625
4
A quantum bit, or qubit, is that the elementary unit of data for a quantum computer almost likes a touch in normal machines. A qubit may be a two-state (or two-level) quantum-mechanical system, one among the only quantum systems displaying the peculiarity of quantum physics. Examples include the spin of the electron during which the 2 levels are often taken as spin up and spin down; or the polarization of one photon during which the two states are often taken to be the vertical polarization and therefore the horizontal polarization. A quantum bit, or qubit, has two quantum states, analogous to the classical binary states. While the qubit are often in either state, it also can exist during a “superposition” of the 2.These states are often represented in so-called Dirac notation, where the state’s label is written between a | and a ⟩. Thus, a qubit’s two component, or “basis,” states are generally written as | 0⟩ and | 1⟩. Any given qubit wave function could also be written as a linear combination of the 2 states, each with its own complex coefficient ai: | ψ⟩ = a0 | 0⟩+ a1 | 1⟩. Since the probability of reading a state is proportional to the square of its coefficient’s magnitude, | a0 | 2 corresponds to the probability of detecting the state | 0⟩, and | a1 | 2 to the probability of detecting | 1⟩. The sum of the possibilities of every possible output state must be 100 percent, mathematically expressed during this case as | a0 | 2 + | a1 | 2 = 1. Bit versus qubit Though a classical bit is entirely specified either as 1 or 0, a qubit is specified by the continuum of the values a0 or a1, which are literally analog—that is, the relative contribution from each possible state are often any value between zero and one, provided the entire probability is one. Of course, this richness exists before the qubit’s state is measured, or “read out.” The results of a measurement looks a bit like a classical bit, a 0 or a 1, with the associated probability of getting each value proportional to the square of absolutely the value of the coefficient of the corresponding state, | a0| 2 or | a1| 2. A digit, characterized as 0 or 1, is employed to represent information in classical computers. When averaged over both of its states (0,1), a digit can represent up to at least one little bit of Shannon information, where a touch is that the basic unit of data. However, during this article, the word bit is synonymous with a digit. In classical computer technologies, a processed bit is implemented by one among two levels of low DC voltage, and whilst switching from one among these two levels to the opposite, a so-called “forbidden zone” between two logic levels must be passed as fast as possible, as electrical voltage cannot change from one level to a different instantaneously. There are two possible outcomes for the measurement of a qubit—usually taken to possess the worth “0” and “1”, sort of a bit or digit. However, whereas the state of a touch can only be either 0 or 1, the overall state of a qubit consistent with quantum physics are often a coherent superposition of together. Moreover, whereas a measurement of a classical bit wouldn’t disturb its state, a measurement of a qubit would destroy its coherence and irrevocably disturb the superposition state. It’s possible to completely encode one bit in one qubit. However, a qubit can hold more information, e.g., up to 2 bits using super dense coding. For a system of n components, an entire description of its state in classical physics requires only n bits, whereas in physics it requires (2n – 1) complex numbers. Operations on qubits There are various sorts of physical operations that will be performed on qubits. Quantum logic gates, building blocks for a quantum circuit during a quantum computer, operate a group of qubits (a register); mathematically, the qubits undergo a (reversible) unitary transformation described by the quantum gates’ unitary matrix. Quantum measurement is an irreversible operation during which information is gained about the state of one qubit (and coherence is lost). The results of the measurement of one qubit with the state ψ = α | 0 ⟩ + β | 1 ⟩ are going to be either | 0 (with probability | α | 2 (with probability | β |. Measurement of the state of the qubit alters the magnitudes of α and β. as an example, if the results of the measurement is | 1 ⟩ α is modified to 0 and β is modified to the phase factor e i ϕ not experimentally accessible. When a qubit is measured, the superposition state collapses to a basis state (up to a phase) and therefore the relative phase is rendered inaccessible (i.e., coherence is lost). Note that a measurement of a qubit state that’s entangled with another quantum system transforms the qubit state, a pure state, into a mixed state (an incoherent mixture of pure states) because the relative phase of the qubit state is rendered inaccessible. This operation collapses the quantum state (exactly like with measurement), which can successively if the qubit is entangled, collapse the state of other qubits. Initialization to | 0 ⟩ could also be implemented logically or physically: Logically as a measurement, followed by the appliance of the Pauli-X gate if the result from the measurement was | 1 ⟩.Physically, for instance if it’s a superconducting phase qubit, by lowering the energy of the quantum system to its state.
<urn:uuid:3f8ef994-3f8c-460d-902c-9dec63fba4f8>
CC-MAIN-2022-33
https://www.technologiesinindustry4.com/2021/07/what-is-the-quantum-bit-or-qubit.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573163.7/warc/CC-MAIN-20220818033705-20220818063705-00448.warc.gz
en
0.93987
1,259
3.96875
4
Technology allows us to communicate instantly with people in our neighborhoods or around the globe. This innovation not only keeps us connected but can help us live safer and healthier lives. Other ways technology is seen to have a positive effect on society include increased knowledge and understanding, improvements in industry and jobs and an interconnectedness of the world as a result of globalization. Technology affects almost every aspect of 21st century life, from transport efficiency and safety, to access to food and healthcare, socialization and productivity. The power of the internet has enabled global communities to form and ideas and resources to be shared more easily. Technology provides students with easy-to-access information, accelerated learning, and fun opportunities to practice what they learn. It enables students to explore new subjects and deepen their understanding of difficult concepts, particularly in STEM. Technology is constantly advancing. This gives rise to new jobs and industries, such as coding and artificial intelligence. Technology provides a makers education in AI, IT, design, and many STEM fields. … All of this is beneficial because it’s estimated that AI will replace 40 percent of jobs in the future. Technology has the ability to enhance daily living from appliances to mobile devices and computers, technology is everywhere. … In the rise of digital communication, technology can actually help communication skills because it allows people to learn written communication to varying audiences. Technology is one of the essential parts of our life which makes this world easier to live and give us more freedom and a lot of ways to live differently. … So, Technology has a significant role in our life especially in making a prosperous life, unlimited communication, and treatment of incurable diseases. A key positive impact of technology on education is that it brings students together through discussion and collaboration tools, who might never have considered, or had the opportunity, to communicate or help each other offline. It can provide empowerment, knowledge, awareness, access, and community. As we develop the technology of the future, we can work towards creating a better world long term. This means many different things as technology merges with all parts of our lives. In terms of classroom administration, for example, technology can provide enhanced record keeping, greatly improving the teacher’s analysis of student performance, especially the identification of skills which could be improved by deliberate practice. This is where technology can really help. “The overall survey results show that higher levels of technology use and technoference adds up to significantly less time spent together as a couple, less satisfaction and connection, and higher levels of depression and anxiety,” he said. In fact, some research indicates that technology can improve both the teaching and learning aspects of education. It also encourages active engagement and interactivity that students are so accustomed to outside of class, and miss when having to pay attention to lesson materials. Artificial Intelligence (AI) Artificial intelligence is probably the most important and ground-breaking trend in technology today. The fact that we have created machines and systems that can think for themselves is truly astounding, and the trend shows no signs of slowing down. Second: Technology provides teachers and students with access to a variety of educational resources that inspire creativity, critical thinking, communication, and collaboration. … This in turn promotes a global awareness, which is an essential component to a 21st century education. Potential benefits of technology for teens easily access information to inform and educate themselves. maintain and develop supportive relationships. form their identities (through self-expression, learning and talking) Technology helps relationships last over time and distance. For friends who can’t always meet in person, technology helps them stay connected. In the pre-digital days, Hampton explains, if you moved out of town for a new job or switched schools, it was a real challenge to stay in touch, no matter how close you were. Selecting the phone/tv/computer over your loved ones can put a lot of stress on the relationship, alienate affection, and leave your partner feeling unappreciated. Stretch this out over months or years, and it can lead to greater conflict, dissatisfaction, and possibly the end of the relationship. Technology plays a significant role in the way that young people communicate and develop friendships. The findings reveal that many children and young people are using a variety of online platforms on a daily basis to communicate with their friends, as well as to create new friendships and maintain existing ones. Experts have found that in addition to making our lives more convenient, but there’s a negative side to technology — it can be addicting and it can hurt our communication skills. Extended screen time can result in health ramifications like insomnia, eyestrain, and increased anxiety and depression. The Internet is the most important new technology which will solve all the major problems existing in the world including all major social issues such as high population rate, poverty, hunger, hygiene problems and much more by spreading awareness about all these major social issues. Whether it is clean energy, robotics, quantum computing, synthetic biology, telemedicine, AI, or cloud education and NUI software, it can solve all the biggest problems confronting mankind. Creating value means coming up with something people will pay for in the real world. Use digital resources well: Schools can use digital resources in a variety of ways to support teaching and learning. Electronic grade books, digital portfolios, learning games, and real-time feedback on teacher and student performance, are a few ways that technology can be utilized to power learning. Because they are at the center of the network of their families, Internet helps them to organize their lives. Also, it helps them to overcome their isolation, particularly in patriarchal societies. The Internet also contributes to the rise of the culture of autonomy. Technology Encourages Individual Learning – Technology personalizes the learning experience and provides greater opportunities for students with varying needs. They can learn at their own speed, go back to lessons and get online instructions to support the learning process.
<urn:uuid:54861be8-57c3-46c7-bffe-ab75e4e997ba>
CC-MAIN-2022-33
https://daitips.com/how-does-technology-help-the-world/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571056.58/warc/CC-MAIN-20220809155137-20220809185137-00051.warc.gz
en
0.945583
1,217
3.5
4
Last week, D-Wave announced a new version of its quantum annealing computer. The new machine includes a number of technical improvements, as well as a significant change to the physical arrangement of the board. What does all this mean? Combined with D-Wave's online resources, a tool that verges on useful is starting to take form. Making a smooth computer Before we reach the gooey chocolate center, we have to deal with the crusty outer coating: what is a quantum annealer? Most computers work in a straightforward manner: to add two numbers together, you construct a set of logical gates that will perform addition. Each of these gates performs a set of specific and clearly defined operations on its input. But that is not the only way to perform computation. Most problems can be rewritten so that they represent an energy minimization problem. In this picture, the problem is an energy landscape, and the solution is the lowest-possible energy of that landscape. The trick is finding the combination of bit values that represents that energy. To do this, we start with an energy landscape that is flat: we can start all the bits in the lowest energy of this flat landscape. Then we carefully and slowly modify the landscape around the bits until it represents our problem. If we have done that correctly, the bits are still in their lowest energy state. We obtain a solution by reading off the bit values. Although this works without anything quantum being involved, D-Wave does this with quantum bits (qubits). That means the qubits are correlated with each other—this is called quantum entanglement. As a result, they change value together, rather than independently. This allows something called quantum tunneling. Imagine a qubit stuck in a high energy state. Nearby, there is a lower energy state that the qubit would prefer to be in. But to get to that low energy state, it first has to go to an even higher energy state. In a classical system, this creates a barrier to reaching the lower energy state. But in a quantum system, the qubit can tunnel through the energy barrier to enter the lower energy state. These two properties may allow a computer like the one that D-Wave operates to obtain solutions for some problems more quickly than its classical counterpart. The devil, however, is in the details. Within the computer, an energy landscape is produced by the coupling (physical connection) among qubits. The coupling controls how strongly the value of one qubit influences the value of the rest of them. This has always been the major sticking point of the D-Wave machine. Under ideal circumstances, every qubit would have couplers that link it directly to every other qubit. That many connections, however, is impractical. A qubit all alone The consequences of the lack of connectivity are severe. Some problems simply cannot be represented by D-Wave machines. Even in cases where they can, the computation can be inefficient. Imagine that a problem required qubits one and three to be connected, but they are not directly connected. In that case, you have to search for qubits that are common to both. Say qubit one is linked to qubit five, while qubit two is linked to qubits five and three. Logical qubit one is then one and five combined. Logical qubit three is qubits two and three linked together. D-Wave refers to this as a chain length of, in this case, two. Chaining costs physical qubits, which are combined to create logical qubits, making fewer available for the computation. D-Wave's development path has been one of engineering ever more complicated arrangements of qubits to increase the connectivity. By increasing the connectivity, the chain lengths become shorter, leaving a larger number of logical qubits. When qubits are tied together to create more connectivity, a larger number of problems can be encoded. The efficiency of structuring some problems is going to be very, very low, meaning that the D-Wave architecture is simply not suited to those problems. But as the connectivity increases, the number of unsuitable problems goes down. In the previous iteration of this machine, the qubits were structured in blocks of eight, such that connectivity between diagonal blocks was improved compared to two versions ago (see the animated gif). This introduced a small improvement in chain lengths. Now D-Wave has moved on to a Pegasus graph. I don't know how to describe it, so I'm going to describe it incorrectly in the strict graph theory sense but in a way I think will make more sense overall. Instead of a single basic unit of eight qubits, there are now two basic units: a block of eight and a pair. In the eight qubit blocks, the qubits are arranged as before, with an inner loop and an outer loop. But, as you can see below, the inner and outer loops have an extra connection. That means that each qubit has five connections within that small block. The blocks are no longer arranged in a regular grid, either, and the interconnections between the qubits from separate blocks are much denser. Whereas the previous generation connected outer loop qubits to outer loop qubits, now each qubit is connected to both inner and outer loops of neighboring blocks. Then, on top of that, there is a new network of long-range connections between different blocks. Each qubit has a long-range connection to another qubit in a distant block. The density of the long-range connectivity is increased by the second basic building block: connected pairs. The pairs are placed around the outside of the main block pattern to complete the long-range connectivity. The idea, I think, is to ensure that the eight qubit groupings near the sides of the chip still have nearly the same connectivity as inner groups, unlike in the chimera graphs. Make the chains shorter What does all this mean? First of all, the similarity between the chimera and pegasus graphs means that code developed for chimera should still work on pegasus. The increased connectivity means the chain lengths are significantly reduced, making calculations more reliable. To give you an idea of how much the new graph improves the situation, a square lattice with diagonal interconnects requires a chain length of six in the chimera graph and chain length of two in the pegasus implementation. In general, chain lengths are reduced by a factor of two or more. The run times are reduced by 30 to 75 percent on the new machine. Aside from the new graph, D-Wave has improved at a technical level: the qubits have lower noise, and there is a much larger number of qubits. The plan is that the new architecture will eventually get D-Wave to 5,000 qubits (up from 2,000). Using the chimera architecture, this would be a nice (but not stellar) upgrade. Adding the changes in architecture means many more of those physical qubits can be used as independent logical qubits, making this a much more significant upgrade.
<urn:uuid:a9c84e87-0bce-48e9-bf88-0e15bee44222>
CC-MAIN-2022-33
https://arstechnica.com/science/2019/03/d-wave-introduces-new-architecture-that-can-scale-to-bigger-problems/?comments=1
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570741.21/warc/CC-MAIN-20220808001418-20220808031418-00451.warc.gz
en
0.952556
1,461
3.5
4
Every time you send an email, connect to your bank account or check your medical examination, you rely on random numbers to protect the security of your online activity. Cryptography is the set of tools we use to keep us safe online, and random numbers are the foundation in which cryptography is built upon. In other words, if we could not generate unpredictable random digits, secure online communications would not be possible. While there are many ways to generate “random numbers”, not all of them are good enough for cryptographic use. For instance, computers are unable to produce random digits on their own, unless we help them with external hardware means. The reason is simple: a computer is a machine designed to reliably execute one instruction after another, in a completely predictable and repeatable way. That said, computers have functions and instructions to generate so-called pseudo-random numbers (PRNGs), which produce sequences of digits with certain “random” statistical properties. But the random numbers produced from a PRNG are completely predictable and therefore cannot be used “as is” for cryptographic applications. The way to bring randomness (or unpredictability, to be more precise) to computers for cryptographic use is via so-called true random number generators (TRNGs). How do true random number generators (TRNGs) work? TRNGs are based on measuring a specific (random) physical process to produce random digits. Thus, the randomness of such numbers comes from the underlying physical process, which may indeed be completely unpredictable. TRNGs are the baseline for security applications. TRNGs are hardware components and sophisticated engineering is required to build them properly. Unfortunately, current communication systems rely on weak TRNG designs, compromising security and/or performance of the communications. There are mainly two reasons for this reliance on weak TRNG designs. First, some systems do not even have a dedicated TRNG hardware component, due to cost or design choice, thus relying on generic components in the system to produce random samples (e.g., clock interrupts from the operating system). Second, many TRNGs are designed based on physical principles that are complex and therefore produce “random-looking” dynamics (e.g., chaos), but which are, by principle, predictable and deterministic, which a sufficiently motivated attacker or a badly operated system may reveal to compromise security. Building reliable, fast and unpredictable TRNGs is essential for the present and future of cryptography. And Quantum technologies are now being used to produce quantum-enhanced TRNGs, that is How do quantum number generators work. What is a quantum random number generator? Quantum random number generators (QRNGs) are a special case of TRNG, that generate randomness by measuring quantum processes, which are, by nature non-deterministic. The advantages are multiple, including a fundamental advantage in using quantum indeterminacy, typically faster performances by leveraging photonics and most importantly, the ability to understand and verify the origin of unpredictability, which is a core assurance for the entire cybersecurity chain. Until now, engineering high-quality, scalable and fast quantum random number generators has been a challenge to date, and this is the area Quside has been pushing to advance over the last decade. Our proprietary technology allows for fast, high-quality, and scalable production, leading to a solution that is ready for today’s unpredictability concerns and tomorrow’s performance requirements. Fast and measurable random number solutions by Quside Quside has been researching, engineering and producing high-quality QRNGs for over a decade. The proprietary technology that Quside has put together provides 3 major advantages: Fast: Quside products can generate hundreds of Mb/s and even Gb/s already today. We leverage photonics to produce very fast random streams. Measurable: using our peer-reviewed Randomness Metrology methods, our customers can access transparently quality metrics that directly relate to the quantum physical principle responsible for unpredictability. Unpredictable: we use a largely peer-reviewed quantum process to generate randomness, thus harnessing nature to enhance entropy production. Additionally, Quside has also put a major effort on scaling the technology, which can be today produced at scale using photonic integrated chips (PICs). How are quantum random numbers generated? About Quside’s phase-diffusion technology, Quside QRNGs are based on the phase-diffusion process in semiconductor lasers. The core element of the technology is converting microscopic quantum observables, which are delicate and hard to measure, into macroscopic dynamics that are robust and easy to capture. To do this, we modulate a semiconductor laser from below to above its threshold level or produce a stream of phase randomized optical pulses. This is called gain-switching. Then, we use an interferometer to convert the phase fluctuations into the amplitude domain, generating a stream of amplitude-randomized optical pulses at the output (see refs [2, 3] for two examples of interferometers that we use). Finally, a fast photodiode converts the photonic signal into the electronic domain, where standard electronics are used for turning the analog signal into the digital realm. At the heart, the unpredictability of the phase-diffusion technology traces back to the process of spontaneous emission, which occurs as a result of the interaction between the quantum vacuum field and the laser’s gain medium. Quside’s technology exploits this quantum-mechanical process to produce quantum-based random numbers at multiple Gigabits per second. More about the Randomness metrology Testing randomness is a complex matter and the way it has been traditionally done is completely flawed. The question “how do you know it is random?” is a hard one to answer, and this is an area where we have been working since 2012, introducing our randomness metrology methodology in 2014 and collaborating with world-leading researchers from NIST, IQOQI and TU Delft to apply it in landmark experiments. Our methodology defines strict quality bounds on all our devices to capture the quality of the unpredictability we produce, and the best part is that we can confidently do it in a transparent manner. This boosts trust and confidence with our customers, who do not have to rely on black boxes anymore for producing their cryptographic material. In many traditional TRNGs, not based on quantum processes, it is extremely hard or even impossible to place rigorous quality bounds. As randomness is not emerging from a fundamentally random process. Quantum Random Number Generator solutions Start using fast and measurable quantum randomness with Quside. Securing communications is undeniably one of the most important endeavors of our society today. New cryptographic standards are now emerging, to enhance even further our protection and governments are releasing their mandates to transition the security of their networks and data, as the Quantum Computing Preparedness Cybersecurity Act by the US government on July 14th, 2022. Migrating to the new post-quantum standards with a hybrid security approach in mind is essential and the time to act is now and building a strong randomness generation foundation on which the new standards can rely upon is equally important. Remember that no security can be achieved unless we can produce unpredictable random numbers, and the question is: are we producing them? How do we now? Using the highest quality randomness generation technologies and monitoring them properly is where Quside can get you to the next level. Frequently Asked Question What is a quantum random number generator? It is a hardware component that is used to generate unpredictable random numbers, typically for cryptography or computation applications. How do quantum number generators work? A quantum random number generator (QRNG) generates streams of random digits by sampling a signal that contains sufficiently large quantum dynamics. Who has developed the quantum random number generator? There are various companies and research labs that have created and built QRNGs. Quside is a leading supplier of high-performance QRNGs. Why do we need QRNG? QRNGs provide several advantages to generate random numbers in applications as cryptography, including the strongest form of unpredictability, the ability to measure the quality through first principles and typically faster performance. Co-funder & CEO PhD in quantum technologies at ICFO, where he developed the quantum randomness technologies that were transferred to Quside. 10 years of experience in quantum and photonics technologies, co-inventor of multiple patent families and co-author of 15+ papers in top scientific journals. Received the award MIT Innovators Under 35 Europe. A research collaboration between Quside, ICFO, and others, has shown how using quantum random number generators provide the required quality and efficiency for safely running even the most complex stochastic simulations. Quantum random numbers for physics-inspired optimization problems Making decisions is commonly a challenge due to the uncertainty and overwhelming information needed to deal with a problem: from every engineering design, data analysis or most business decisions to...
<urn:uuid:f2bae6af-1f0d-44f2-bbe8-f4e4845a3c02>
CC-MAIN-2022-33
https://quside.com/quantum-random-number-generators-why-how-where/page/2/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571692.3/warc/CC-MAIN-20220812105810-20220812135810-00451.warc.gz
en
0.917492
1,882
3.953125
4
A quantum mechanics experiment performed by physicists at the University of California in Santa Barbara has been honored by the journal Science as its Breakthrough of the Year. The researchers’ work may shed light on just what actually gravity is, among other things. The team, led by Andrew Cleland, showed that a relatively large object’s reactions can be predicted by quantum mechanics theory. “The real impact of our experiment is more in the foundations of physics in the sense that it helps show quantum mechanics still applies to large objects,” Cleland told TechNewsWorld. “If you can do quantum mechanical experiments with objects that are big enough, you could see what effect gravity has on a quantum mechanical system.” Although gravitation is the weakest of four fundamental forces, or interactions, that make up every physical phenomenon, it’s has several unique features, one of them being that it has infinite range. About the Experiment Cleland’s team, which consisted of himself, fellow physicist John Martinis and doctoral student Aaron O’Connell, basically took a microwave frequency mechanical resonator and wired it to a superconducting qubit, then cooled the whole thing to near absolute zero and zapped it with a little energy to see what would happen. They then took this resonator and put it in a quantum superposition, a state in which it simultaneously had zero and one quantum of excitation. Energetically, this is the same as being in two places at the same time. A qubit is a bit of quantum information. Like a bit in computing, it can have two possible values — a 0 or a 1. Unlike a bit, it can be 0, 1 or both together, which is called a “superposition.” A superposition is a quantum mechanical property of a particle that lets it occupy all its possible quantum states simultaneously. The particle can be thought of as omnipresent in its superposition, if you like. A superconducting qubit results when you use nanofabricated superconducting electrodes coupled through Josephson junctions. A Josephson junction consists of a thin layer of non-superconducting material between two layers of superconducting material. Think of it as a ham sandwich without mayo, butter or condiments. Superconducting qubits go right through the non-superconducting material. Cleland’s team cooled its gadget to its lowest-energy state, in this case zero. This is called the “ground state.” “We got a dilution refrigerator; it’s a piece of commercial apparatus that anybody can buy for a couple of hundred thousand dollars,” Cleland said. “It’ll cool a few kilos of copper to about two hundredths of a degree above zero.” His team then cooled the resonator to its quantum ground state, then applied one quantum unit, or phonon, of energy. A phonon is a quantum mechanical description of a vibration in which a lattice uniformly oscillates at one frequency, known as the “normal mode.” Cleland’s team then measured the result with “classical equipment,” Cleland said. The resonator has a resonance frequency of 6 GHz, and the energy exchange rate was 100MHz, Cleland stated. The team had to do this repeatedly in order to get and verify its results. “One of the features of quantum mechanics is that, when you do a measurement, you destroy the state that was prepared,” Cleland explained. “We prepped the system, measured, then recorded the measurement on a state that was prepared identically thousands of times.” Possible Uses for the Discovery Cleland’s team made its discovery while it was trying to build a quantum computer. Quantum mechanics directly use quantum mechanical phenomena such as superposition and entanglement to work on data. Entanglement is a state in which two or more objects have their quantum states linked together so that you have to describe both and can’t describe either on its own. Cleland’s team also might use it in quantum communications, wherein quantum information is encoded into invisible light. Quantum information has no analog in standard information theory. The quantum nature of systems must be preserved in order to process information in a quantum computer or to distribute a secret key in quantum cryptography. Quantum communications might be used in teleportation. However, Cleland’s vision is a little more down-to-earth, in a sense — in the nearer future, the results of the experiment might help physicists better understand gravity. “Quantum mechanics works really well for small objects like atoms and electrons,” Cleland said. “But for large mechanical systems, there’s not been any good demonstrations, and there’s been this question as to whether quantum mechanics really applies to big mechanical things.” Even though the object used in the experiment was the size of a human hair, it was still “a trillion times bigger” than those used in previous experiments. It shows that the laws of quantum mechanics apply to relatively large objects. Widespread, practical application of this kind of research is still a long way off, Rob Enderle, principal analyst at the Enderle Group, told TechNewsWorld. “It’s to prove quantum theories and build up a base of knowledge so that more complex and more practical applications can be derived.” Eventually, products made using this knowledge might relate to near-instant communications over long distances, new sources of energy, and more efficient use of energy, Enderle stated. “We are at the stage where we’re looking to see if it’s possible to walk,” Enderle remarked. “Then we have to figure out how to walk; then actually walk. Running is the goal.” Short of a major breakthrough, we’re 20 to 50 years away from mass-market products based on advances in quantum mechanics, Enderle said.
<urn:uuid:ef54357b-f1c1-49b3-a817-8bd61952fd64>
CC-MAIN-2022-33
https://www.linuxinsider.com/story/2010s-breakthrough-of-the-year-brings-us-a-hair-closer-to-teleportation-71488.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572515.15/warc/CC-MAIN-20220816181215-20220816211215-00252.warc.gz
en
0.946597
1,279
3.609375
4
LTAT.00.014 3-6 ECTS Basqect* — Basics of Quantum Error Correction Classical communication and computation devices are prone to errors, which makes (classical) error correcting codes necessary. For quantum devices, that problem is much larger: Not only are quantum errors more pervasive, but correcting them is more subtle, as inspecting the quantum state in order to fix it is not easily possible without destroying it. While some low-hanging fruit in quantum communication and quantum computation can be reached on noisy quantum devices (e.g., simple Quantum Key Distribution in quantum communication, or simple simulations of condensed matter with quantum computing devices), for the second quantum revolution (arising from the ability to coherently manipulate quantum systems) to happen, quantum error must be corrected. Luckily, based on Nobel-prize worthy ideas of Peter Shor from the late 1990s, correcting quantum error and "undoing" decoherence (due to interaction of the quantum device with the rest of the universe) is possible, and can be understood and applied by the math-capable student. *) Pronounce like "basket" 🙄 Students with undergraduate degree in Math have additional options: Contact the instructors! The course will focus on the predominant proposal for quantum error correction, which is based on so-called Stabilizer Codes. As the target audience is computer science students, it is necessary, though, to go through some math first. At the end of the course, the successful student will have an understanding of stabilizer codes, and how to use them to correct Pauli quantum errors. - Math: Basics of group theory (normal subgroups, isomorphism theorems, group actions, centralizer, normalizer, etc) - Quantum: Groups of unitary operators, Pauli groups and Clifford groups - Math: Linear algebra over the field with 2 elements - Quantum: GF(2)-arithmetic for Pauli (sub-)groups - Quantum: Review of density operators and quantum channels - Quantum: Stabilizer codes and their stabilizer groups, dimension of the code space, logical qubits, logical operations, error syndromes and error correction - Examples: 5-qubit code, Steane 7-qubit code, 9-qubit Shor-code.... There are rumors according to which it might be possible to teach the subject matter more in a "physics style". The present course (designed and taught by mathematicians) is strictly mathematical, though, involving lots of yummy rigorous proofs. This is a "Book Course", i.e., students mostly learn independently, based on lecture notes handed out to them... ... But this is the first time that this "Book Course" is being taught, and we will have to figure out what the best organization is. Current plan: - The course takes place in weeks 1-8 of the semester - Two class meetings per week: One with the instructor (to learn from / discuss with them), and one without the instructor (for students to learn from / discuss with each other). - Homework (reviewing, reading, simple exercises, understanding) is not handed in / marked. - The lecture notes will be created in parallel with actual black-board lectures. - Pass-fail evaluation by exam (written? oral?) - Javier Gil Vidal (classes) - Assoc Prof Dirk Oliver Theis (design & content) At the end of this 3-ECTS course, you can correct quantum errors — which is amazing! In terms of quantum communication, it already gets you somewhere. The 6 ECTS reading course LTAT.00.0015 "Fatol Surf" (FAult TOLerance with SURFace codes) subsumes the content of LTAT.00.0014, and then, in the second half of the semester, moves to fault tolerant quantum computing based on surface codes (a special type of stabilizer codes) — the real deal. Fatol Surf is considerably more demanding though, than Basqect: It has no lectures, and the reading consists of a couple of research papers. The number of participants in Fatol Surf is limited; students specializing in quantum computing are preferred. The ultimate goal of Fatol Surf is: (1) to understand what really happens on a quantum computing device (e.g., quantum repeater, quantum computer) with error correction when a quantum algorithm is executed; and (2) to become able to experiment with quantum codes (surface or not) on quantum communication or computing devices, and so contribute to the quantum revolution. This course's page on Quantum Computing at the University of Tartu. ECTS and the relationship between Basqect and Fatol Surf - Basqect (LTAT.00.0014) keeps you busy busy for a half semester (= 3 ECTS). - Fatol Surf (LTAT.00.0015) keeps you busy for the full semester (= 6 ECTS), the first half of which is the content of Basqect.
<urn:uuid:262d9631-a50c-4435-b05a-7ad6d3199b26>
CC-MAIN-2022-33
https://courses.cs.ut.ee/2021/Basqect/fall
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571097.39/warc/CC-MAIN-20220810010059-20220810040059-00651.warc.gz
en
0.895224
1,067
3.5
4
In computing and telecommunications a bit is a basic unit of information storage and communication; it is the maximum amount of information that can be stored by a device or other physical system that can normally exist in only two distinct states. These states are often interpreted (especially in the storage of numerical data) as the s 0 and 1. They may be interpreted also as logical values, either "true" or "false"; or two settings of a flag or switch, either "on" or "off". In information theory, "one bit" is typically defined as the uncertainty of a binary random variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known. In quantum computing, a quantum bit or qubit is a quantum system that can exist in superposition of two bit values, "true" and "false". The symbol for bit, as a unit of information, is "bit" or (lowercase) "b"; the latter being recommended by the IEEE 1541 Standard (2002). The encoding of data by discrete bits was used in the punched cards invented by Basile Bouchon and Jean-Baptiste Falcon (1725), developed by Joseph Marie Jacquard (1804), and later adopted by Semen Korsakov, Charles Babbage, Hermann Hollerith, and early computer manufacturers like IBM. Another variant of that idea was the perforated paper tape. In all those systems, the medium (card or tape) conceptually carried an array of hole positions; each position could be either punched through or not, thus potentially carrying one bit of information. The encoding of text by bits was also used in Morse code (1844) and early digital communications machines such as teletypes and stock ticker machines (1870). Ralph Hartley suggested the use of a logarithmic measure of information in 1928. Claude E. Shannon first used the word bit in his seminal 1948 paper A Mathematical Theory of Communication. He attributed its origin to John W. Tukey, who had written a Bell Labs memo on 9 January 1947 in which he contracted "binary digit" to simply "bit". Interestingly, Vannevar Bush had written in 1936 of "bits of information" that could be stored on the punch cards used in the mechanical computers of that time. The first programmable computer built by Konrad Zuse used binary notation for numbers, whose bits were realized as electrical relays which could be either "open" or "closed". Transmission and processing Bits can be implemented in many forms. In most modern computing devices, a bit is usually represented by an electrical voltage or current pulse, or by the electrical state of a flip-flop circuit. For devices using positive logic, a digit value of 1 (true value or high) is represented by a positive voltage relative to the electrical ground voltage (up to 5 volts in the case of TTL designs), while a digit value of 0 (false value or low) is represented by 0 volts. In semiconductor memory, such as dynamic random-access memory or flash memory, the two values of a bit may be represented by two levels of electrical charge stored in a capacitor. In programmable logic arrays and certain types of read-only memory, a bit may be respresented by the presence or absence of a conducting path at a certain point of a circuit. In magnetic storage devices such as magnetic tape, magnetic disc, or magnetic bubble memory, it may be represented by the polarity of magnetization of a certain area of a ferromagnetic film. In optical discs, a bit is encoded as the presence or absence of a microscopic pit on a reflective surface. Information capacity and information content Information capacity of a storage system is only an upper bound to the actual quantity of information stored therein. If the two possible values of one bit of storage are not equally likely, that bit of storage will contain less than one bit of information. Indeed, if the value is completely predictable, then the reading of that value will provide no information at all (zero bits). If a computer file that uses n bits of storage contains only m < n bits of information, then that information can in principle be encoded in about m bits, at least on the average. This principle is the basis of data compression technology. Sometimes the name bit is used when discussing data storage while shannon is used for the statistical bit. There are several units of information which are defined as multiples of bits, such as byte (8 bits), kilobit (either 1000 or 210 = 1024 bits), megabyte (either 8,000,000 or 8×220 = 8,388,608 bits), etc. Computers usually manipulate bits in groups of a fixed size, conventionally named "words". The number of bits in a word varies with the computer model; typically between 8 to 80 bits; or even more in some specialized machines. The International Electrotechnical Commission's standard IEC 60027 specifies that the symbol for bit should be "bit", and this should used in all multiples, such as "kbit" (for kilobit). However, the letter "b" (in lower case) is widely used too. The letter "B" (upper case) is both the standard and customary symbol for byte. In telecommunications (including computer networks), data transfer rates are usually measured in bits per second (bit/s) or its multiples, such as kbit/s. (This unit is not to be confused with baud.) When a bit within a group of bits such as a byte or word is to be referred to, it is usually specified by a number from 0 (not 1) upwards corresponding to its position within the byte or word. However, 0 can refer to either the most significant bit or to the least significant bit depending on the context, so the convention of use must be known. Certain bitwise computer processor instructions (such as bit set) operate at the level of manipulating bits rather than manipulating data interpreted as an aggregate of bits. Other information units Other units of information, sometimes used in information theory, include the natural digit also called a nat or nit and defined as log2 e (≈ 1.443) bits, where e is the base of the natural logarithms; and the decit, ban or Hartley, defined as log210 (≈ 3.322) bits.. Conversely, one bit of information corresponds to about ln 2 (≈ 0.693) nats, or log10 2 (≈ 0.301) Hartleys. Some authors also define a binit as an arbitrary information unit equivalent to some fixed but unspecified number of bits.) - Units of information - Integral data type - Primitive type - Information entropy - Binary arithmetic - Ternary numeral system - John B. Anderson, Rolf Johnnesson (2006) Understanding Information Transmission. - Simon Haykin (2006), Digital Communications - Norman Abramson (1963), Information theory and coding. McGraw-Hill. - Darwin among the machines: the evolution of global intelligence, George Dyson, 1997. ISBN 0-201-40649-7 - National Institute of Standards and Technology (2008), Guide for the Use of the International System of Units. Online version. - Amitabha Bhattacharya, Digital Communication
<urn:uuid:381a7ace-2d43-4dd3-9b9f-50c47468d435>
CC-MAIN-2022-33
https://wiki.gis.com/wiki/index.php/Bit
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573197.34/warc/CC-MAIN-20220818124424-20220818154424-00052.warc.gz
en
0.903915
1,573
3.5
4
What is Digital Computer : The digital computer is a type of electronic device that can process and store data electronically. It was first created in the 1940s and has since been used in many different ways. A digital computer is a machine that manipulates and processes data using ones and zeros. The ones and zeros are the basic building blocks of digital information. - What is a Digital Computer and What are its Key Components? - History of Digital Computers: From Vacuum Tubes to Transistors - Applications of Digital Computers: From Military to Business - Future of Digital Computers: Emerging Technologies and their Impact - What are Digital Computer and their Types? - Where are Digital Computers used? - Advantages and Disadvantages of Digital Computer What is a Digital Computer and What are its Key Components? Digital computers store information in bits, which are groups of one or more zeros and ones. Each bit can represent either a one (1) or a zero (0). A digital computer consists of a processor, memory, input/output devices, and a bus. The processor is responsible for executing the instructions stored in memory to perform operations such as data manipulation, calculation, and text processing. Input/output devices allow the computer to communicate with other devices outside of it. The bus allows multiple devices to interact with each other. History of Digital Computers: From Vacuum Tubes to Transistors Digital computers are ubiquitous in modern society. From the humble vacuum tube computer of the 1950s, to today’s digital devices like smartphones and tablets, the history of digital computers is a story of technological innovation and dramatic change. In this article, we’ll explore some of the key milestones in digital computer development, from vacuum tubes to transistors, and look at how these early machines paved the way for the modern age of computing. The digital computer is a technology that emerged in the 1940s and 1950s. The first programmable computers were developed at the start of World War II, but it was not until the 1960s that digital computers became commercially available. - The first commercial digital computer, the Ferranti Mark 1 was introduced in 1957. - The first mass-produced computer, the IBM System /360 , was introduced in 1964. - The first commercial personal computer, the IBM PC , went on sale in 1981. - The mass-produced personal computer was the Macintosh , released by Apple in 1984. - The first tablet computer was the Apple iPad , which began selling in 2010. - The first microprocessor was developed in 1971 by the Intel Corporation. In 1972, Intel released the 4004 , the first single-chip microprocessor that could be used to implement an entire computer system on a single integrated circuit. - Delta Computer Systems: A History of Innovation - The Fast Ways to Speed Up an HP Laptop in a Minute - Secure Exam Browser Keeps You Safe When Working Online - What is VGA Full Form in a Computer? - PDF Download Hardware and Software PDF Free eBook - Ada Lovelace: The Mother Of Computer Programming Applications of Digital Computers: From Military to Business As digital computers have become more ubiquitous in business and military settings, their applications have expanded to include a diverse range of fields. Here are Six examples: 1) Financial analysis: By tracking stock prices, businesses can make better decisions about when to sell stocks and invest in new ventures. Digital computers also help researchers predict patterns in financial data. 2) Manufacturing: With 3D printing technology becoming more affordable, businesses can create customized products on-demand. Computers are also used to monitor the production process and optimize efficiency. 3) Surveillance: Businesses use video surveillance systems to keep an eye on their customers and employees. Digital cameras and software can detect movements that could indicate criminal activity. 4) Medicine: Doctors use digital images to screen patients for diseases. They can also use diagnostic tools like CAT scans and MRI scans to measure brain tumors and other problems. 5) Tourism With the rise of social media, tourists use cameras to capture their experiences and share them with friends. This helps increase the popularity of a place and draw in more visitors. 6) Security Cameras are a vital part of security systems that keep dangerous criminals and intruders out. The Role Of Cameras In Surveillance Systems The role of digital cameras in surveillance systems gets you the highest quality pictures with excellent resolution. Future of Digital Computers: Emerging Technologies and their Impact Digital computers have been with us for almost 50 years and are still with us today. However, there are many emerging technologies that are poised to have a significant impact on digital computers in the future. These include quantum computing, neuromorphic computing, and artificial intelligence. It is still too early to say which of these technologies will ultimately dominate, but they all hold promise for making digital computers even more powerful and efficient. What are Digital Computer and their Types? Digital computers are devices that use electric signals to process data. There are three main types of digital computers: central processing units (CPUs), graphics processing units (GPUs), and embedded systems. CPUs are the most common type of digital computer, and they include the processors used in laptops, desktops, servers, and other devices. GPUs are designed for gaming and video rendering, but they can also be used to handle complex mathematical calculations. Embedded systems are tiny computer chips that can be found in everything from cars to smartwatches. Where are Digital Computers used? There are many places around the world where digital computers are used. Some common locations include research labs, factories, and offices. Advantages and Disadvantages of Digital Computer The advantages of digital computers are many. They are fast, efficient, and reliable. They can store large amounts of data, often making them the choice for businesses and organizations. Some disadvantages of digital computers include their high price tag and their reliance on electrical power. The digital computer has revolutionized the way we live and work. It has made our world more efficient, and it has allowed us to do things that we never thought possible. We can now interact with the world around us in ways that were once unimaginable, and we can do so without ever having to leave our homes. This technology is here to stay, and it will continue to evolve in ways that we can only imagine.
<urn:uuid:c6feee7b-ccb7-47a1-9329-c3df55808ae1>
CC-MAIN-2022-33
https://www.basiccomputerknowledge.in/what-is-digital-computer/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571246.56/warc/CC-MAIN-20220811073058-20220811103058-00653.warc.gz
en
0.938355
1,322
3.6875
4
Images of the electron trap architecture. Top: Schematic representation of the experiment. Current of surface electrons, induced by ac voltage applied to the electrode underneath Reservoir 1, flows between Reservoirs 1 and 4, as shown by the red arrow. Middle: Cross section of the central microchannel around the gate area. Bottom: Photograph of the microchannel device on a copper sample cell, with subsequent close-up photographs of the central channel and surrounding reservoirs.Credit: Denis Konstantinov The future of quantum computing is a hot topic not only for experts but also in many commercial and governmental agencies. Rather than processing and storing information as bits in transistors or memories, which limit information to the binary ‘1’ or ‘0’, quantum computers would instead use quantum systems, such as atoms, ions, or electrons, as ‘qubits’ to process and store “quantum information” in, which can be in an infinite number of combinations of ‘1 and 0’. Large technology corporations, such as Google, Microsoft, Intel, and IBM are investing heavily in related projects that may lead to realize the quantum computer and technologies. At the same time, universities and research institutes around the world are researching novel quantum systems, adoptable for quantum computing. The Quantum Dynamics Unit at the Okinawa Institute of Science and Technology Graduate University (OIST), has recently made novel findings about electrons floating on the surface of liquid helium, a quantum system which may be a new candidate for quantum computing into reality. These results were published in Physical Review B. One of the common problems in quantum computing research using solids is that it is very difficult to make perfectly identical qubits because intrinsic defects or impurities in the materials used randomly affect each individual qubit performance. “Our motivation for pursuing a liquid helium system is that it is intrinsically pure and free of defects, which theoretically allows for the creation of perfectly identical qubits. Additionally, we can move electrons in this liquid helium system, which is difficult or nearly impossible in other quantum systems,” explained Prof. Denis Konstantinov, head of the Quantum Dynamics Unit. Therefore, it is believed that adopting this system for quantum computing might bring the whole field to the next level. Find your dream job in the space industry. Check our Space Job Board » Utilizing electrons on a liquid helium surface for quantum computing requires isolating individual electrons on a helium surface and controlling their quantum degrees of freedom, either motional or spin. It may also require the movement of electrons to different locations, thus it is also important to understand the physics of the interaction between electrons and the helium surface. It was previously discovered that electrons on helium can form a two-dimensional crystal, and some unique phenomena occur when this crystal moves along the helium surface, due to the interaction between electrons and surface waves. The OIST scientists, however, are the first to probe how these phenomena depend on the size of the electron crystal. To test this, Dr. Alexander Badrutdinov, Dr. Oleksandr Smorodin and OIST PhD student Jui-Yin Lin, built a microscopic channel device that contained an electron trap within to isolate a crystal of a relatively small number of electrons. This crystal would then be moved across the liquid helium surface by altering electrostatic potential of one of the device electrodes. This motion would be detected by measuring image charges, which are induced by the moving electrons, flowing through another electrode using a commercially available current amplifier and lock-in detector. “This research gave us some insights into the physics of the interaction between electrons and the helium surface, as well as expanded our micro-engineering capabilities” states Dr. Alexander Badrutdinov, a former member of the Quantum Dynamics Unit and the first author of the paper. “We successfully adopted a technology to confine electrons into microscopic devices, on the scale of few microns. With this technology we studied the motion of microscopic two-dimensional electron crystals along a liquid helium surface and saw no difference between the movement of large electron crystals, on the scale of millions to billions of electrons, and crystals as small as a few thousands of electrons, when theoretically, differences should exist.” This research is the first step at OIST in the prospect of using this system for quantum computing. According to Konstantinov, “the next step in this research is to isolate an even smaller electron crystal, and ultimately, a single electron, and to move them in this system. Unlike other systems, this system has the potential to be a pure, scalable system with mobile qubits.” In theory, this type of system would have the potential to revolutionize the quantum computing research field. A.O. Badrutdinov, A. V. Smorodin, D. G. Rees, J. Y. Lin, D. Konstantinov. Nonlinear transport of the inhomogeneous Wigner solid in a channel geometry. Physical Review B, 2016; 94 (19) DOI: 10.1103/PhysRevB.94.195311
<urn:uuid:4dfea570-ec69-4da1-8f47-9e6f4646f921>
CC-MAIN-2022-33
https://sciencebulletin.org/study-electron-movement-helium-may-impact-future-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571847.45/warc/CC-MAIN-20220812230927-20220813020927-00254.warc.gz
en
0.924293
1,056
3.5
4
What is Quantum Computing There is an international race to build a quantum computer that transcends the capacity of conventional computers and to build ultra-secure communication networks – a race that has been called the space race of the 21st century. These technologies have the potential to transform the information economy and create the industries of the future, solving in hours or minutes problems that would take conventional computers – even supercomputers – centuries, and tackling otherwise intractable problems that even supercomputers could not solve in a useful timeframe. Present-day computers are really fast, and they are getting very powerful, however they aren’t fast enough to perform all of the calculations that we need them to in a useful time frame. Quantum computers use quantum mechanics to perform certain complex calculations in a smaller number of steps than an ordinary computer. However, not all algorithms run faster on quantum hardware – only certain ones with particular features. Identifying exactly which problems can benefit from quantum computing is an active area of research today. Potential applications include machine learning, scheduling and logistical planning, financial analysis, stock market modelling, software and hardware verification, rapid drug design and testing, and early disease detection and prevention. A 2020 report from CSIRO revealed that quantum computing in Australia has the potential to create 10,000 jobs and A$2.5 billion in annual revenue by 2040, while spurring breakthroughs in drug development, industrial processes, and machine learning. A quantum computer is a machine that performs its calculations by harnessing the unique features of quantum mechanics. In ordinary computing, information is stored in bits, and each bit stores either a 0 or a 1. Many bits together can represent all sorts of information using binary code, which computers can process. Quantum computers process quantum information, which is stored in quantum bits, called qubits (pronounced “KYU-bits”). A qubit can be any quantum object with two states – for example, a single electron (spin up or spin down) or a single photon (polarised horizontally or vertically). In everyday life, we usually have a good intuition regarding how the physical world will behave. Drop a glass and it will smash on the floor. Punch a concrete wall and your fist won’t go through it. But in the world of the ultra-small – atoms and electrons – none of the normal rules apply. Instead particles follow quantum rules that are quite baffling. Like a bit, a qubit can be in one of its two states, labelled 0 or 1, but unlike a bit, a qubit can also be in a superposition of 0 and 1. Superposition is a subtle concept. Measuring a qubit always gives either 0 or 1, but superpositions can be manipulated beforehand so that one of the two outcomes is more likely. Multiple qubits together can be put into more complicated superpositions. Measuring the qubits always gives a binary string of 0s and 1s, but the likelihood of what string appears can be controlled beforehand, and this is what a quantum computer does. In fact, quantum computers work by first creating a superposition of lots of different possible solutions to a problem – encoded in qubits – and then manipulating that superposition so that wrong solutions cancel out and right ones are strengthened. This is because the alternatives in a superposition can interfere like waves do. This makes the right answer much more likely to appear when you measure the qubits. For certain types of problems, these two steps can be completed very quickly – outperforming any ordinary computer in solving the original problem. Building the quantum computer hardware that will work reliably, and is large enough, to process quantum information without errors is a big challenge. Worldwide there is a huge experimental effort to do just that. There are many different designs being explored to build a universal quantum computer – some of these include superconducting circuits, ion traps, optics, and silicon. In Australia, the Centre for Quantum Computation and Communication Technology (CQC²T) is a world leader in two of the most promising types of hardware for a quantum computer: optical qubits (made of light) and silicon qubits (made of either nuclear or electron spins). A large-scale universal quantum processor capable of outperforming today’s computers for a wide-range of useful applications needs to have millions of qubits and very few errors. Small-scale quantum computers called noisy intermediate-scale quantum (NISQ) processors already exist and can be accessed through the internet – i.e., through “cloud quantum computing.” These devices are currently relatively small in qubit number and error prone but are very important in pointing the way forward. To achieve commercial success, we require larger-scale quantum computers with error correction, and that is likely to take at least another 5 years and will continue to improve over the next decade. Quantum Communication technology has the potential to send messages securely against any sort of hacker, no matter how powerful their computer is – even a quantum computer! The basic idea is simple. Heisenberg’s Uncertainty Principle implies that if you find out one property of a particle you necessarily create uncertainty in other properties. That is, quantum particles are disturbed by measurements. Because of this, an eavesdropper trying to read a secret message encoded in photons will leave unmistakable traces of this transgression on the message itself. These traces clearly reveal the attempt to eavesdrop, ensuring detection before any of your valuable information is compromised. Quantum communication protocols were first developed in the 1980s. There are short-range systems in commercial operation in many countries, including an Australian one developed by CQC²T researchers. Recently ground-satellite quantum encryption links have also been demonstrated by scientists in USTC, China [Sheng-Kai Liao et al., ‘Satellite-to-ground quantum key distribution’, Nature, 2017, 549:43; Ji-Gang Ren et al., ‘Ground-to-satellite quantum teleportation, Nature, 2017, 549:70.) and MPL, Germany [K Günthner et al., ‘Quantum-limited measurements of optical signals from a geostationary satellite’, Optica, 2017, 4:611]. A grand challenge, which is being tackled worldwide, including at CQC²T, is to extend the range of secure communications into a global network. Because quantum messages cannot be copied, this requires using quantum repeaters to realise a large-scale quantum network. Analogous to the fibre repeater links in global fibre optics networks, quantum repeaters are special-purpose quantum devices that bridge a connection between a distant quantum source and receiver are critical infrastructure for a globally connected quantum network. Designing and making them – and showing their viability – is an active area of research.
<urn:uuid:b938dda5-f339-497a-b88d-c29a0022556f>
CC-MAIN-2022-33
https://www.cqc2t.org/education/what-is-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571472.69/warc/CC-MAIN-20220811133823-20220811163823-00055.warc.gz
en
0.920423
1,413
3.84375
4
Imagine a window with an image etched into its surface, but when you walk around the other side, the image is entirely different. - Nanoengineering manipulates the path light travels through a material - This allows two separate images to be seen when viewed from opposite sides - ‘Nonlinear optics’ could have applications in computing and lead to a faster internet Although it sounds impossible, that’s essentially what researchers at the Australian National University (ANU) have achieved, with tiny translucent slides that can show two separate images, at the same time, when viewed from opposite sides. . In one experiment, for example, scientists created a slide showing the Australian continent on one side and the Sydney Opera House on the other. The advance in the field known as “nonlinear optics” could have applications in photonic computing – using visible light or infrared instead of electric current to perform numerical calculations. These new light-based devices could eventually lead to a faster and cheaper internet, the researchers said. Their research was published today in Nature Photonics. How it works? As you may have noticed, light generally travels the same path forward and backward through a material like glass or water. To change that, the researchers created tiny glass slides coated with cylinder-shaped nanoparticles, each particle so small that 12,000 of them could fit in the cross-section of a human hair. Each cylinder controlled the flow of light like traffic signs directing traffic, said ANU physicist and co-author Sergey Kruk. “We were able to introduce an asymmetry in the way light travels,” he said. “So when the light propagates forwards and when it propagates backwards, we get completely different results.” The technical name for these “traffic signs” is “nonlinear dielectric resonators”. The cylinders were made of two layers of silicon and silicon nitride. Each layer had a different index of refraction – a measure of how fast light travels through a medium, and therefore the material’s light-bending ability. The different refractive indices of air and water, for example, explain why a spoon in a glass of water looks bent. These cylinders could be positioned to be “light” or “dark” only for the rear or front directions, or “light” or “dark” for the front and rear. By arranging these four types of cylinders into patterns, Dr. Kruk and his colleagues from China, Germany and Singapore were able to form images. “Basically, slides are made up of individual pixels,” Dr. Kruk said. “And we can put those pixels together in any pattern you like.” Benjamin Eggleton, director of the Sydney Nano Institute, called the research “significant” and a “fundamental finding”. “This is a heroic fundamental breakthrough,” said Professor Eggleton, who was not involved in the research. The most obvious application, he said, was “nano-photonic components” for computing. A The key element in electronic computing and the complex architecture of microchips is the diode which allows electrical current to flow in only one direction. In photonics, or light-based computing, a diode is called an isolator. The current harvest of Insulators are relatively large and complicated, but ANU’s research could lead to much smaller and simpler designs, Professor Eggleton said. Photonic circuits, or optical computing, have been dubbed the future of computing because they can be smaller than electronic circuits, operate at higher speeds, use less power, and generate less heat. “Many leading companies commercializing quantum computing technology rely on photonic circuits,” Professor Eggleton said. “And on those circuits, you’ll need those insulators.” Dr. Kruk has also seen applications in photonic circuits. This could ultimately lead to faster and cheaper internet, he said. Two years ago, for example, researchers built a clocked photonic circuit 44.2 terabits per second on 76 kilometers of optical fibers installed between two university campuses in Melbourne. By comparison, that’s around 1 million times faster than the average Australian broadband download speed. Physicists are just beginning to understand how intense light interacts with the structure of materials at the nanoscale, Dr Kruk said. “At this point in technological development, we’ve gotten incredibly good at controlling electric currents, and we’re not so good at controlling beams of light. “This [research] perhaps a first convincing step towards the establishment of a very sophisticated control of the traffic of the light beams. “[This is] similar to a very sophisticated control of the traffic of electric currents, which we began to establish perhaps in the middle of the 20th century.” Job , updated
<urn:uuid:1803c65a-0b8a-40d7-8a70-f3f1d6ce8e67>
CC-MAIN-2022-33
https://patent-dfmm.org/nanoparticles-that-control-the-flow-of-light-could-mean-a-faster-cheaper-internet/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571222.74/warc/CC-MAIN-20220810222056-20220811012056-00055.warc.gz
en
0.938991
1,057
3.953125
4
In quantum computing, the quantum Fourier transform (QFT) is a linear transformation on quantum bits, and is the quantum analogue of the discrete Fourier transform. The quantum Fourier transform is a part of many quantum algorithms, notably Shor's algorithm for factoring and computing the discrete logarithm, the quantum phase estimation algorithm for estimating the eigenvalues of a unitary operator, and algorithms for the hidden subgroup problem. The quantum Fourier transform was discovered by Don Coppersmith. The quantum Fourier transform can be performed efficiently on a quantum computer, with a particular decomposition into a product of simpler unitary matrices. Using a simple decomposition, the discrete Fourier transform on amplitudes can be implemented as a quantum circuit consisting of only Hadamard gates and controlled phase shift gates, where is the number of qubits. This can be compared with the classical discrete Fourier transform, which takes gates (where is the number of bits), which is exponentially more than . The quantum Fourier transform acts on a quantum state vector (a quantum register), and the classical Fourier transform acts on a vector. Both types of vectors can be written as lists of complex numbers, in the quantum case it is a sequence of probability amplitudes for the different outcomes upon measurement. Because measurement collapses the quantum state to a single value (called basis state, or eigenstate), not every task that uses the classical Fourier transform can take advantage of the quantum Fourier transform's exponential speedup. The best quantum Fourier transform algorithms known (as of late 2000) require only gates to achieve an efficient approximation. The quantum Fourier transform is the classical discrete Fourier transform applied to the vector of amplitudes of a quantum state, where we usually consider vectors of length . The classical Fourier transform acts on a vector and maps it to the vector according to the formula: where and is an N-th root of unity. Similarly, the quantum Fourier transform acts on a quantum state and maps it to a quantum state according to the formula: (Conventions for the sign of the phase factor exponent vary; here we use the convention that the quantum Fourier transform has the same effect as the inverse discrete Fourier transform, and vice versa.) Since is a rotation, the inverse quantum Fourier transform acts similarly but with: In case that is a basis state, the quantum Fourier Transform can also be expressed as the map where . We get, for example, in the case of and phase the transformation matrix Most of the properties of the quantum Fourier transform follow from the fact that it is a unitary transformation. This can be checked by performing matrix multiplication and ensuring that the relation holds, where is the Hermitian adjoint of . Alternately, one can check that orthogonal vectors of norm 1 get mapped to orthogonal vectors of norm 1. From the unitary property it follows that the inverse of the quantum Fourier transform is the Hermitian adjoint of the Fourier matrix, thus . Since there is an efficient quantum circuit implementing the quantum Fourier transform, the circuit can be run in reverse to perform the inverse quantum Fourier transform. Thus both transforms can be efficiently performed on a quantum computer. with the primitive -th root of unity. The circuit is composed of gates and the controlled version of As already stated, we assume . We have the orthonormal basis consisting of the vectors The basis states enumerate all the possible states of the qubits: where, with tensor product (or Kronecker product) notation , indicates that qubit is in state , with either 0 or 1. By convention, the basis state index is the binary number encoded by the , with the most significant bit. With this convention, we may write the quantum Fourier transform as: It is also useful to borrow fractional binary notation: With this notation, the action of the quantum Fourier transform can be expressed in a compact manner: In other words, the discrete Fourier transform, an operation on n qubits, can be factored into the tensor product of n single-qubit operations, suggesting it is easily represented as a quantum circuit (up to an order reversal of the output). In fact, each of those single-qubit operations can be implemented efficiently using a Hadamard gate and controlled phase gates. The first term requires one Hadamard gate and controlled phase gates, the next one requires a Hadamard gate and controlled phase gate, and each following term requires one fewer controlled phase gate. Summing up the number of gates, excluding the ones needed for the output reversal, gives gates, which is quadratic polynomial in the number of qubits. Consider the quantum Fourier transform on 3 qubits. It is the following transformation: where is a primitive eighth root of unity satisfying (since ). For short, setting , the matrix representation of this transformation on 3 qubits is: The 3-qubit quantum Fourier transform can be rewritten as: In the following sketch, we have the respective circuit for (with reversed order of output qubits with respect to the proper QFT): As calculated above, the number of gates used is which is equal to , for . Relation to quantum Hadamard transformEdit Using the generalized Fourier transform on finite (abelian) groups, there are actually two natural ways to define a quantum Fourier transform on an n-qubit quantum register. The QFT as defined above is equivalent to the DFT, which considers these n qubits as indexed by the cyclic group . However, it also makes sense to consider the qubits as indexed by the Boolean group , and in this case the Fourier transform is the Hadamard transform. This is achieved by applying a Hadamard gate to each of the n qubits in parallel. Note that Shor's algorithm uses both types of Fourier transforms, both an initial Hadamard transform as well as a QFT. - Coppersmith, D. (1994). "An approximate Fourier transform useful in quantum factoring". Technical Report RC19642, IBM. - Michael Nielsen and Isaac Chuang (2000). Quantum Computation and Quantum Information. Cambridge: Cambridge University Press. ISBN 0-521-63503-9. OCLC 174527496. - Hales, L.; Hallgren, S. (November 12–14, 2000). "An improved quantum Fourier transform algorithm and applications". Proceedings 41st Annual Symposium on Foundations of Computer Science: 515–525. CiteSeerX 10.1.1.29.4161. doi:10.1109/SFCS.2000.892139. ISBN 0-7695-0850-2. S2CID 424297. - Fourier Analysis of Boolean Maps– A Tutorial –, pp. 12-13 - Lecture 5: Basic quantum algorithms, Rajat Mittal, pp. 4-5
<urn:uuid:19c34eb4-dc8b-4479-9bbf-ea59958c499a>
CC-MAIN-2022-33
https://en.m.wikipedia.org/wiki/Quantum_Fourier_transform
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570871.10/warc/CC-MAIN-20220808183040-20220808213040-00658.warc.gz
en
0.855992
1,482
3.546875
4
CLASSIC MAGIC TRICK MAY ENABLE QUANTUM COMPUTING A new project will use the electric field in an accelerator cavity to try to levitate a tiny metallic particle, allowing it to store quantum information Quantum computing could solve problems that are difficult for traditional computer systems. It may seem like magic. One step toward achieving quantum computing even resembles a magician’s trick: levitation. A new project at the U.S. Department of Energy’s Thomas Jefferson National Accelerator Facility will attempt this trick by levitating a microscopic particle in a superconducting radiofrequency (SRF) cavity to observe quantum phenomena. This is a line drawing of an accelerator cavity that will be used in a proof of principle project that aims to levitate a tiny metallic particle, allowing it to store quantum information. Typically at Jefferson Lab and other particle accelerator facilities, SRF cavities enable studies of the atom’s nucleus. They do this by accelerating subatomic particles, such as electrons. This project will use the same type of cavity to instead levitate a microscopic particle of metal, between 1 and 100 micrometers in diameter, with the cavity’s electric field. “No one has ever intentionally suspended a particle in an electric field in a vacuum using SRF cavities,” said Drew Weisenberger, a principal investigator on this project, as well as Chief Technology Officer and head of the Radiation Detector and Imaging Group in the Experimental Nuclear Physics Division at Jefferson Lab. If the project team is able to levitate a particle, they might be able to then impart a quantum state on it by cooling the trapped particle to its lowest possible energy level (because that’s when quantum properties occur). “Storing quantum information on a levitated nanoparticle is our ultimate goal, but for now, it is a proof of principle experiment,” said Pashupati Dhakal, another principal investigator on the project and a staff scientist at Jefferson Lab in the Accelerator Operations, Research and Development Division. “We want to know if we can trap and levitate particles inside the cavity using the electric field.” Exploring the Quantum with Accelerator Cavities The idea for this project came from observations of accelerator experts. They think they have already unintentionally levitated unwanted and rare nanoparticles of metal, such as niobium and iron, inside SRF cavities during particle accelerator operations. They suspect that this unintentional levitation has impacted the performance of SRF cavity components. Researchers are attempting to use a several-decades-old technique called “laser trapping”, as a step toward reliably imparting a quantum state on a particle suspended in a laser beam. But, the Jefferson Lab project team thinks that SRF cavities may provide a better tool for those researchers. “An electric field could go potentially beyond the capabilities of laser trapping,” Weisenberger said. Intrinsic characteristics of SRF cavities will overcome some limits of laser trapping. A levitated particle in an SRF cavity that is under vacuum and chilled to super cold temperatures will only interact with the cavity’s electric field and not lose information to the outside, which is important for maintaining a quantum state. “Like storing information on a computer chip, the quantum state will stay and not dissipate,” Weisenberger said. “And that could eventually lead to applications in quantum computing and quantum communications.” This project, titled “SRF Levitation and Trapping of Nanoparticles Experiment,” is funded by the Laboratory Directed Research & Development program, which provides resources for Jefferson Lab personnel to make rapid and significant contributions to critical science and technology problems relevant to the mission of Jefferson Lab and the DOE. A Multidisciplinary Approach The project was conceived and launched by Rongli Geng in October 2021 before he transitioned to Oak Ridge National Laboratory. It has now shifted to a larger and more multi-disciplinary team led by Weisenberger and Dhakal, the current co-principal investigators. Weisenberger’s team researches detector technology for nuclear physics research, whereas Dhakal’s work focuses on developing SRF cavities to accelerate electrons at high speeds. Weisenberger says that the multidisciplinary approach will bring together their expertise as they branch out together into the less familiar territory of this LDRD project. Both principal investigators remark that the project is moving forward well, thanks to the diligence and expertise supplied by every member of the team. Team members include John Musson, Frank Marhauser, Haipeng Wang, Wenze Xi, Brian Kross and Jack McKisson. “It’s an interesting step outside of the usual things that we do,” Weisenberger said. “The LDRD program lets loose Jefferson Lab scientists and engineers on a research question that isn’t directly related to what we’re actually hired to do, but is making use of all the expertise that we bring and it’s a great resource to tap to try to stretch. That’s what we’re doing with this project, stretching.” Building and Testing Before turning the project over the Weisenberger and Dhakal, Geng and his colleagues had determined the required parameters of the cavity and electric field with simulations and calculations. “We have everything on paper but we have to make it into a reality,” Dhakal said. The team is currently setting up the experiment in real life. “We have to see if what was simulated can actually happen,” Weisenberger said. First, they’ll assemble a mock-up of the experiment at room temperature. Then, they’ll circulate liquid helium around the outer surfaces of the cavity to cool it to superconducting temperatures approaching absolute zero. Next comes the most difficult part. They must get a single microscopic particle in the correct region of the cavity while the cavity is locked up inside a containment vessel at superconducting temperatures, under vacuum, and with the electric field on. “We’ve come up with a way to remotely launch a particle in the cavity under experimental conditions, we just have to test it now,” Weisenberger said. “In the research and development world, you often can’t do what you thought you could do. We try and test and run into problems, try to solve the problems, and keep going.” This is a year-long project with the possibility of another year of funding, depending on how things go. It is also an early stage, proof of principle project. If it is ultimately successful, there would still be a long road of R&D before the concepts could be applied toward building quantum computers. Such computers would require levitating and imparting quantum states on tens to hundreds to thousands of much smaller particles predictably and reliably. Still, the researchers are looking forward to the discoveries they hope this study will enable regarding microscopic particle levitation and potential observation of a quantum state. “I’m optimistic,” Dhakal said. “Either way, we’ll discover something. Failure is just as much a part of R&D as success. You learn from both. Basically, whether the particle levitates or not, or whether we can impart the quantum state to it or not, it’s something that’s never been done before. It’s very challenging and exciting.” The team already has a research paper in the works for this project, but only time will tell whether they can realize this bit of magic in the laboratory. Content may have been edited for style and clarity.
<urn:uuid:3243b045-0e7b-495f-aaae-a818a3e6ae26>
CC-MAIN-2022-33
https://qubitreport.com/quantum-computing-science-and-research/2021/08/21/classic-magic-trick-may-enable-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570793.14/warc/CC-MAIN-20220808092125-20220808122125-00058.warc.gz
en
0.92614
1,613
3.578125
4
Quantum computing is here to shake the existing mechanical, electrical and electronic systems. Modern electronics in particular will not be the same if quantum computing gains acceptance. Therere voices of support as well as dissent. In this post, well analyze future trends in quantum computing. Keep reading! Quantum computers use atoms to perform calculation. The computation speed depends principally on Qubits (quantum bits). These quantum bits are the fundamental building blocks of a quantum computer. Recent developments in the field of quantum research expect to eliminate/drop the Moore’s law by 2020. The future of Quantum computers as of now is not very certain particularly due to already known problems in areas such as de-coherence, error correction, output observance and cost related issues. But, if scientists succeed in developing a practically useful quantum computer, it may replace traditional computers in sectors such as robotics (Industrial Automation), cyber security, alternative energy etc. Such computers may also be deployed for solving emerging tactical problems like Tsunami alerts. Quantum computers can scale up the possibility of enhancing computation power to a new and unanticipated peak point by providing a fast and efficient platform for high performance computing. At present, we don’t have very efficient systems capable of solving tactical problems such as Correct weather forecasting Predicting right patterns in stock markets Analyzing the molecular/ DNA part of human body in medical research. Today, processor die size is drastically shrinking, but there is not enough software solutions developed for harnessing the full processor potential. Computing power over the next few years will perhaps get skyrocketed with the advent of quantum computers. Many experts argue that the computing world today doesn’t even have the right programs to actually utilize a 1 GHz mobile processor in the best possible way. It’s not more processor speed but better programs we need urgently right now, or is it? Have a look at some areas where quantum computers can play a vital role in near future: Artificial intelligence (AI) was primarily meant to assist humans in executing complex jobs such as handling operations in the middle of a furnace blast or during space and military missions. Today, robotic systems are heavily used in the industrial automotive world for boosting production. Introduction of quantum computing can give a major boost to AI by ensuring creation of even more powerful & intelligent robots. The capability of encoding information in fuzzy quantum states will multiply the power of these artificial creatures. It would be possible to scan through large databases in few seconds with qubits. Quantum AI techniques can dramatically speed up image acquisition and processing techniques. Algorithms have already been developed and ready for implementation in quantum computers now. But recent failures in controlling Qubits inside laboratories, pose serious questions regarding the viability of quantum computing. Developed robots featuring powerful qubit will be able to break maximum encryption code within near zero time. A quantum computer will possibly crack any possible password in an instant. No security algorithm will then be able to provide 100% security to content placed on web servers. As far as the Internet is concerned, everything (yes, everything) will have to be redefined using quantum computers. Qubits (known as Quantum dots in solar terminology) can be largely deployed in solar panels to replace the current photovoltaic cells technology. Quantum dot is a nanoscale particle of semiconducting material that can be embedded. It can therefore revolutionize the renewable energy sector. Qubits can also be used to make quantum batteries in order to store energy generated by powerful windmills. Teleportation (if it ever becomes a reality) will allow transfer of matter from one place to another without traversing through physical medium. With this technology, (some say) time travelling can become possible which still is considered a myth. Quantum teleportation technology will enable humans to travel far distances without losing a moment as seen in fictional/sci-fi movies. Right now, it’s all speculation. Quantum computers can be connected in series to form a quantum network, thus building a smart grid. They will offer high encoding and decoding speeds with fast transfer of information (qubits). Smart energy grids will offer high efficiency in energy delivery system. Additionally, quantum computers can also be used to process large amount of data coming from geothermal activities. The already developed and much touted quantum computer from ‘D-Wave’ systems is 3600 times powerful than a conventional PC. But the project was declared a failure on application front by Google. Questions about the real-world feasibility of such expensive projects remain unanswered. But, given the fact that everything from cellphones, wireless networks and electricity was no less than a miracle few dozen years ago, quantum computing too may appear as a miracle at first and slowly become an integral part of our lives. About Amy Baker A computer science engineer, Amy holds a Masters Degree in Quantum Computing. She is based in Texas. The content & opinions in this article are the author’s and do not necessarily represent the views of RoboticsTomorrow This post does not have any comments. Be the first to leave a comment below. Post A Comment You must be logged in before you can post a comment. Login now.
<urn:uuid:349e3b87-7b1e-4eb6-afe8-49b55ae6580e>
CC-MAIN-2022-33
https://www.roboticstomorrow.com/article/2014/02/an-uncertain-future-for-quantum-computing/235/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573399.40/warc/CC-MAIN-20220818185216-20220818215216-00459.warc.gz
en
0.918999
1,107
3.546875
4
One day in 1900, German physicist Max Planck told his son that he had made a breakthrough as important as Isaac Newton’s discovery of the workings of the universe. Planck had reached the surprising conclusion that light behaves as if it is packaged in discrete amounts, or quanta, a seemingly simple observation that would lead to a powerful new field of physics called quantum mechanics. Quantum mechanics is the most successful physical theory ever devised, and you learn what distinguishes it from its predecessor, classical mechanics. Professor Schumacher explains his ground rules for the course, which is designed to teach you some of the deep ideas and methods of quantum mechanics. You investigate the age-old debate over whether the physical world is discrete or continuous. By the 19th century, physicists saw a clear demarcation: Matter is made of discrete atoms, while light is a continuous wave of electromagnetic energy. However, a few odd phenomena remained difficult to explain. At the beginning of the 20th century, Max Planck and Albert Einstein proposed revolutionary ideas to resolve puzzles about light and matter. You explore Planck's discovery that light energy can only be emitted or absorbed in discrete amounts called quanta, and Einstein's application of this concept to matter. Light propagates through space as a wave, but it exchanges its energy in the form of particles. You learn how Louis de Broglie showed that this weird wave-particle duality also applies to matter, and how Max Born inferred that this relationship makes quantum mechanics inherently probabilistic. You explore the mystery of why atoms are stable. Niels Bohr suggested that quantum theory explains atomic stability by allowing only certain distinct orbits for electrons. Erwin Schrödinger discovered a powerful equation that reproduces the energy levels of Bohr's model. One of the most famous and misunderstood concepts in quantum mechanics is the Heisenberg uncertainty principle. You trace Werner Heisenberg's route to this revolutionary view of subatomic particle interactions, which establishes a trade-off between how precisely a particle's position and momentum can be defined. You focus on the Einstein-Bohr debate, which pitted Einstein's belief that quantum events can, in principle, be known in every detail, against Bohr's philosophy of complementarity—the view that a measurement of one quantum variable precludes a different variable from ever being known. Beginning his presentation of quantum mechanics in simplified form, Professor Schumacher discusses the mysteries and paradoxes of the Mach-Zehnder interferometer. He concludes with a thought experiment showing that an interferometer can determine whether a bomb will blow up without necessarily setting it off. The interferometer from the previous lecture serves as a test case for introducing the formal math of quantum theory. By learning a few symbols and rules, you can describe the states of quantum particles, show how these states change over time, and predict the results of measurements. Many quantum particles move through space and also have an intrinsic spin. Analyzing spin gives you a simple laboratory for exploring the basic ideas of quantum mechanics, and it is one of your key tools for understanding the quantum world. Macroscopic objects obey the snowflake principle. No two are exactly alike. Quantum particles do not obey this principle. For instance, every electron is perfectly identical to every other. You learn that quantum particles come in two basic types: bosons, which can occupy the same quantum state; and fermions, which cannot. You discover that the tendency of bosons to congregate in the same quantum state can lead to amazing applications. In a laser, huge numbers of photons are created, moving in exactly the same direction with the same energy. In superconductivity, quantum effects allow electrons to flow forever without resistance. Why is matter solid, even though atoms are mostly empty space? The answer is the Pauli exclusion principle, which states that no two identical fermions can ever be in the same quantum state. At the fundamental level, bosons and fermions differ in a single minus sign. One way of understanding the origin of this difference is with the Feynman ribbon trick, which Dr. Schumacher demonstrates. When two particles are part of the same quantum system, they may be entangled with each other. In their famous "EPR" paper, Einstein and his collaborators Boris Podolsky and Nathan Rosen used entanglement to argue that quantum mechanics is incomplete. You chart their reasoning and Bohr's response. Thirty years after EPR, physicist John Bell dropped an even bigger bombshell, showing that a deterministic theory of quantum mechanics such as EPR violates the principle of locality—that particles in close interaction can't be instantaneously affected by events happening in another part of the universe. Feynman diagrams are a powerful tool for analyzing events in the quantum world. Some diagrams show particles moving forward and backward in time, while other particles appear from nowhere and disappear again. All are possible quantum scenarios, which you learn how to plot. The quantum vacuum is a complex, rapidly fluctuating medium, which can actually be observed as a tiny attraction between two metal plates. You also discover that vacuum energy may be the source of the dark energy that causes the universe to expand at an ever-accelerating rate. You explore quantum information and quantum computing—Dr. Schumacher's specialty, for which he pioneered the concept "qubit," the unit of quantum information. You learn that unlike classical information, such as a book or musical recording, quantum information can't be perfectly copied. The uncopyability of quantum information raises the possibility of quantum cryptography—an absolutely secure method for transmitting a coded message. This lecture tells how to do it, noting that a handful of banks and government agencies already use quantum cryptography to ensure the security of their most secret data. What are the laws governing quantum information? Charles Bennett has proposed basic rules governing the relationships between different sorts of information. You investigate his four laws, including quantum teleportation, in which entanglement can be used to send quantum information instantaneously. You explore the intriguing capabilities of quantum computers, which don't yet exist but are theoretically possible. Using the laws of quantum mechanics, such devices could factor huge numbers, allowing them to easily decipher unbreakable conventional codes. What is the fundamental nature of the quantum world? This lecture looks at three possibilities: the Copenhagen, hidden-variable, and many-worlds interpretations. The first two reflect Bohr's and Einstein's views, respectively. The last posits a vast, multivalued universe encompassing every possibility in the quantum realm. In this final lecture, you ponder John A. Wheeler's metaphor of the Great Smoky Dragon, a creature whose tail appears at the start of an experiment and whose head appears at the end. But what lies between is as uncertain as the mysterious and unknowable path of a quantum particle. Kuratiert wunderbare wissenschaftliche Materialien für Menschen. Dokumentationen, Vorträge und Filme. Alles handelsfrei.
<urn:uuid:8ff0edbf-8bb3-4ddc-a6dd-9f2f657cb647>
CC-MAIN-2022-33
https://www.videoneat.com/de/courses/20424/quantum-mechanics-the-physics-of-the-microscopic-world/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572063.65/warc/CC-MAIN-20220814173832-20220814203832-00061.warc.gz
en
0.91595
1,466
4.1875
4
Researchers from MIT, Google, and elsewhere have designed a novel method for verifying when quantum processors have accurately performed complex computations that classical computers can’t. They validate their method on a custom system (pictured) that’s able to capture how accurately a photonic chip (PNP) computed a notoriously difficult quantum problem. Image © Mihika Prabhu. Last year on October 23, 2019, a research paper was published in the journal Nature stating that the quantum speedup is achievable in a real-world system. Quantum computer is superior to classical computers that work on the principle of binary code; 0 and 1. Quantum computers use quantum bits (qubits) and qubits can represent both a 0 and 1 at the same time. Though a quantum computer may work on the state between 1 and 0, when qubits are measured the result is always either a 0 or a 1. But how do we verify that the quantum chip- a crucial technical part of a quantum computer, is working correctly? But not to worry, a team of researchers from MIT has described a novel protocol to efficiently verify that a Noisy Intermediate Scale Quantum (NISQ) chip has performed all the right quantum operations. The research work was published in the journal Nature Physics. Quantum chips perform computations using quantum bits, called qubits. Qubits can represent the two states corresponding to classic binary bits - a 0 or 1 - or both states simultaneously called the quantum superposition of both states. The unique superposition state is where the quantum computer shows its prowess: superposition states enable quantum computers to solve complex problems that are practically impossible for classical computers and potentially serve as a breakthrough in material design, drug discovery, and machine learning, among other applications. But achieving a fully workable quantum computer is not an easy task; full-scale quantum computers will require millions of qubits, which isn’t yet feasible. Scientists have been working on designing feasible quantum chips for the past years and recently they are busy developing Noisy Intermediate Scale Quantum (NISQ) chips that can contain around 50 to 100 qubits. The chip’s outputs can look entirely random, so it takes a long time to simulate steps to determine if everything went according to plan. And here the team validates their protocol on a notoriously difficult quantum problem running on a custom quantum photonic chip and successfully described a novel protocol to efficiently verify that a NISQ chip has performed all the right quantum operations indicating that they just do not operate randomly. Jacques Carolan, first author and a postdoc in the Department of Electrical Engineering and Computer Science (EECS) and the Research Laboratory of Electronics (RLE) said, “As rapid advances in industry and academia bring us to the cusp of quantum machines that can outperform classical machines, the task of quantum verification becomes time-critical. Our technique provides an important tool for verifying a broad class of quantum systems. Because if I invest billions of dollars to build a quantum chip, it sure better do something interesting.” The basic idea of testing the quantum chip was easy; they just fed an output quantum state generated by the quantum circuit back to a known input state. By this, they were able to diagnose which circuit operations were performed on the input to produce the output since those operations should always match what researchers programmed and if not, the researchers can use the information to pinpoint where things went wrong on the chip. The main idea of the new protocol was to divide and conquer where instead of doing whole thing in one shot, which takes a very long time, they do this unscrambling layer by layer. This divide and conquer rule allowed researchers to break problems and tackle them in a more efficient way. The idea of divide and conquer was inspired by the working of neural networks - which solve problems through many layers of computation - and successfully build a novel quantum neural network (QNN) where each layer represents a set of quantum operations. To run the QNN, they used traditional silicon fabrication techniques to build a 2-by-5-millimetre NISQ chip with more than 170 control parameters. The control parameters were tunable circuit components and can manipulate the photon path easier. Then a pair of photons having specific wavelengths were generated from an external component and injected into the chip and those photons travel through the chip’s phase shifters. The phase shifter was used to change the path of photons. This phenomenon eventually produces a random quantum output state which represents what would happen during computation. Now the output was measured by an array of external photodetector sensors. Then the output was sent to QNN where the first layer uses complex optimization techniques to dig through the noisy output and pinpoint the signature of a single photon among all those scrambled together. Then as required it unscrambles that single photon from the group to identify what circuit operations return it to its known input state. Those operations should match exactly the circuit’s specific design for the task. Here all the subsequent layers do the same computation - removing from the equation any previously unscrambled photons - until all photons are unscrambled. For instance, let’s say the input state of qubits fed into the processor was all zeroes and the NISQ chip executes a bunch of operations on the qubits to generate a massive, seemingly randomly changing number as output. An output number will constantly be changing as it’s in a quantum superposition. Layer by layer now the QNN selects chunks of that massive number and then and determines which operations revert each qubit back down to its input state of zero. If any operations are different from the originally planned operations, then researchers will know that something has gone awry. Researchers can inspect any mismatches between the expected output to input states, and use that information to tweak the circuit design. Researchers were able to unsample two photons that had run through the boson sampling problem on their custom NISQ chip - and in a fraction of time it would take traditional verification approaches and also claimed that in addition for quantum verification purposes, this process helps to capture useful physical properties.
<urn:uuid:5f7bc160-7619-4cf6-b0eb-26dba5b58cae>
CC-MAIN-2022-33
https://physicsfeed.com/post/method-verify-whether-quantum-chips-are-accurately-executing-operations/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570765.6/warc/CC-MAIN-20220808031623-20220808061623-00661.warc.gz
en
0.94182
1,256
3.65625
4
With the advent of quantum computing, the need for peripheral fault-tolerant logic control circuitry has reached new heights. In classical computation, the unit of information is a “1” or “0”. In quantum computers, the unit of information is a qubit which can be characterized as a “0”, “1”, or a superposition of both values (known as a “superimposed state”). The control circuitry in classical computers is CMOS (semiconductor) based, due to its high-performance and low power dissipation. The “1’s” and “0’s” of a classical computer can be manipulated, stored, and easily read using CMOS chips that operate at room temperature. Most quantum computers today operate at cryogenic temperatures, to ensure that the qubit remains coherent (in a superimposed state) for as long as possible. The coherence times are typically very short (nanoseconds to milliseconds) in a quantum computer, prompting the need for control circuitry that can perform high-speed, fault-tolerant operations. This requirement could be met by conventional CMOS control circuitry if it could be operated at cryogenic temperatures. The first attempt to characterize semiconductor materials at cryogenic temperatures was made by A.K. Jonscher in his 1964 Proceedings of the IEEE publication, entitled “Semiconductors at Cryogenic Temperatures” . His two basic conclusions were: 1) semiconductor devices have no major cryogenic application at that point in time due to “no real technological justification for going on a large scale to these extreme temperatures”, and 2) “the properties of semiconductor materials at cryogenic temperatures are so strikingly different from the familiar properties at higher temperatures, that it is reasonable to expect many more device applications to emerge as a result of continued research and development effort in this direction”. A few years later, IBM became interested in low-temperature semiconductor device operation [2-3] and concluded that MOSFET semiconductor devices show improved performance at cryogenic temperatures. With the advantages of low-temperature operation, scaling-down the cooling apparatus is still an obstacle in using semiconductor-based control circuitry. Enter quantum mechanics. In 1959, Richard Feynman challenged the scientific community to employ quantum mechanics in the design of information processing systems. He envisioned new information systems and functions that involved quantized energy levels, and/or the interactions of quantized “spins” (angular momentum of quantum particles). His vision was realized in the 1980s, when it was demonstrated that quantum mechanical, energy-based equations could represent a universal Turing (computational) machine . In 1994, it was shown that a quantum computer could factor integer numbers much more quickly than a classical computer (“in polynomial time”) . This discovery was the catalyst that fostered continued interest in building quantum computing systems. That interest continues today at numerous commercial, research and academic organizations. Even with the strong interest in building quantum computers, the fact remains that successful operation of this type of computer currently requires a cryogenic temperature environment. Quantum logic control circuitry will also need to operate at these cryogenic temperatures to function effectively in this environment. Thus, we have seen a resurgence of interest in the cryogenic temperature performance of CMOS-based circuitry. Quantum computers do not require state-of-the-art CMOS circuitry, but CMOS devices operate differently at cryogenic and room temperatures. CMOS transistor performance (and the associated I-V performance) has recently been measured on 40 nm and 160 nm bulk CMOS devices, at both room temperature and at 4.2 degrees Kelvin (see Figure 1). Drive current increases at cryogenic temperatures due to an increase of the mobility in silicon at these temperatures. Unfortunately, other effects such as substrate freeze-out can limit the increase in drive current at these low temperatures.Control circuitry for quantum computers is currently being operated at room temperature. As mentioned earlier, this can be a problem due to the sensitivity of reading the “state” of qubits at higher temperatures. This challenge can be partially alleviated by operating the CMOS circuitry at or near cryogenic temperatures, in the same cryogenic freezers as the quantum computer. This integration can serve to reduce latency and increase overall system scalability. Despite some second order issues, CMOS transistors at low temperatures can perform various functions needed to work with a quantum computer. These functions include the ability to perform as I/V converters, low-pass filters, and A/D and D/A converters (see Figure 2). To achieve the desired performance of a fault-tolerant quantum computer system, a new generation of deep-submicron CMOS circuits will be required that operate at deep-cryogenic temperatures . Extrapolating this idea to its logical conclusion, one ends up with a quantum integrated circuit (QIC) where the array of qubits is integrated on the same chip as the CMOS electronics required to read the state of the qubits. This integration would clearly be the ultimate goal in achieving scalable, reliable, and high performing quantum computing. In more futuristic applications, optical communications to and from the qubit may also be necessary. In this case, integrated CMOS circuits will also need to include micro- and nano-optical structures, such as light-guides and interferometers. These types of optical functions have been successfully demonstrated on room-temperature CMOS devices. Demonstrating this level of optical communications functionality at cryogenic temperatures may also be desirable in future quantum computing applications.
<urn:uuid:2c2c5e3e-82ea-4b3e-bef0-e5233452ec18>
CC-MAIN-2022-33
https://www.coventor.com/blog/quantum-computers-cmos-semiconductors-review-future-predictions/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573908.30/warc/CC-MAIN-20220820043108-20220820073108-00663.warc.gz
en
0.927405
1,176
3.5625
4
The quantum internet will change the world -- it will unlock applications ranging from ultra-secure communication to high-performance AI systems to unprecedented medical images. So what’s stopping us? Building a global quantum internet from today’s laboratory experiments requires the ability to transmit qubits over long distances. Quantum repeaters are the key to unlocking this. Quantum repeaters vs. Classical repeaters Before we look at the role of quantum repeaters in the quantum internet, let’s consider an analogous device -- a non-quantum, or “classical,” repeater. The Internet transfers information in the form of bits along fiber optic cables. Some of these cables travel long distances, such as the SEA-ME-WE 3 undersea cable that reaches from Germany to Japan. However, as light passes through these fibers, it suffers from loss, or “attenuation,” as photons are absorbed by the fiber. To account for this, a “repeater” is inserted between nodes. Repeaters simply measure the signal coming in from one side, copy it, and retransmit it at higher power to the other side. As a result, the quantum internet is able to transmit information reliably over very long distances. Loss is a problem in quantum networks as well, but unfortunately the same technique of measuring, copying, and retransmitting doesn’t translate to the quantum communications realm. This is due to a fundamental aspect of quantum information -- it cannot be copied. This fact is known as the no-cloning theorem. It turns out that we can’t measure quantum states on their way from point A to point B without destroying them. This actually provides some of the amazing benefits of quantum communications, like ultra-secure communication, but also means that we can’t use the same idea from classical repeaters to avoid loss in quantum channels. So, how can we avoid the problem of loss in a quantum network? How quantum repeaters work Despite their name, quantum repeaters actually use a very different strategy than classical repeaters to handle the problem of loss. The core idea is based on the technique of entanglement swapping. The primary goal of quantum networks is to distribute entanglement between members of the network. Entanglement distribution unlocks all kinds of applications, including even transmitting qubits. Entanglement swapping is a clever idea that gets around the problem of loss without violating the no-cloning theorem. Entanglement swapping uses teleportation to create long-distance entanglement from a chain of locally connected repeaters Entanglement swapping works by generating a single long-distance entanglement from many short-distance entanglements. One of the biggest obstacles to distributing long-distance entanglement is the exponential loss incurred due to fiber attenuation. Say Alice and Bob are connected by a fiber that is too long to transmit photons at a reasonable rate. They can add a repeater in the middle that instead accepts entangled photons from both Alice and Bob and then converts those into entanglement between Alice and Bob. In this way, the photons only need to travel half the distance and have a higher chance of making it all the way to their destination. While the act of “gluing” together two separate entanglement links may sound magical, the repeater can do this using a simple operation called teleportation. As long as the repeater has qubits that are entangled with pairs at each of Alice and Bob, it can perform a measurement and report to Alice and Bob the information they need to use their newly entangled connection. By building up a chain of repeaters, we can break down long distances into more manageable segments over which to send our photons. Teleportation between two nodes has been experimentally demonstrated by many different research groups, in many different scenarios (through a free-space link over 143 kilometers, across the Danube, and over a ground-to-satellite uplink). Most recently, Caltech demonstrated teleportation using telecom wavelengths, the wavelength of choice for building a quantum internet on top of existing classical infrastructure. So if we already have such a plethora of successful quantum teleportation experiments, why can’t we build real quantum repeaters? Well, efforts are already underway for some early demonstrations. However, the first repeaters need to be designed to handle the limitations of current devices. In fact, a timeline of repeater technology has emerged, separating repeaters into three categories: 1st generation, 2nd generation, and 3rd generation. These generations do not necessarily make each other obsolete, but they show how networks can expand to support increasingly powerful applications as technology improves. Three generations of quantum repeaters Image adapted from: Muralidharan, S., Li, L., Kim, J. et al. Optimal architectures for long distance quantum communication. Sci Rep 6, 20463 (2016). https://doi.org/10.1038/srep20463 1st Generation Repeaters Repeaters need to rely on quantum processors to accomplish their jobs. However, today’s quantum processors are very error prone. To make up for this, 1st generation repeaters will use a process called entanglement distillation. The idea behind entanglement distillation is that you can “distill” a high quality entanglement from many copies of low quality entanglement. While a network with 1st generation repeaters will enable some groundbreaking applications, it’s communication rate is highly limited by the process of distillation. 2nd Generation Repeaters As error rates improve, quantum repeaters can transition from relying on entanglement distillation to quantum error correction to handle operation errors. Quantum error correction handles errors by encoding information into blocks of qubits, where errors can more easily be handled. This will allow networks to transfer information at much higher speeds and enable further applications. 3rd Generation Repeaters Finally, once quantum devices have improved enough, quantum error correction will be able to be used to handle both loss and operation errors. Essentially, this allows nodes to trust that their information will travel safely to other nodes, without having to listen to hear from each repeater that entanglement was established. This will have a huge improvement on the rate of communication and unlock even more applications. A global quantum internet using repeaters will enable game-changing applications Quantum networks are already under development! For example, the Center for Quantum Networks, hosted at the University of Arizona, plans to develop the first quantum network enabling fully error-corrected quantum connectivity, enabled by quantum repeaters. Similar efforts are underway in National Labs and at Universities across the United States and the globe. Developing working quantum repeaters will be a key to the success of these efforts.
<urn:uuid:fe52c80d-9223-45ea-b6a5-be6add52c991>
CC-MAIN-2022-33
https://www.aliroquantum.com/blog/what-are-quantum-repeaters
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882570868.47/warc/CC-MAIN-20220808152744-20220808182744-00463.warc.gz
en
0.911038
1,398
3.65625
4
A proof-of-concept published today in Nature promises warmer, cheaper and more robust quantum computing. And it can be manufactured using conventional silicon chip foundries. Most quantum computers being developed around the world will only work at fractions of a degree above absolute zero. That requires multi-million-dollar refrigeration and as soon as you plug them into conventional electronic circuits they’ll instantly overheat. But now researchers led by Professor Andrew Dzurak at UNSW Sydney have addressed this problem. “Our new results open a path from experimental devices to affordable quantum computers for real world business and government applications,” says Professor Dzurak. The researchers’ proof-of-concept quantum processor unit cell, on a silicon chip, works at 1.5 Kelvin – 15 times warmer than the main competing chip-based technology being developed by Google, IBM, and others, which uses superconducting qubits. “This is still very cold, but is a temperature that can be achieved using just a few thousand dollars’ worth of refrigeration, rather than the millions of dollars needed to cool chips to 0.1 Kelvin,” explains Dzurak. “While difficult to appreciate using our everyday concepts of temperature, this increase is extreme in the quantum world.” Quantum computers are expected to outperform conventional ones for a range of important problems, from precision drug-making to search algorithms. Designing one that can be manufactured and operated in a real-world setting, however, represents a major technical challenge. The UNSW researchers believe that they have overcome one of the hardest obstacles standing in the way of quantum computers becoming a reality. In a paper published in the journal Nature today, Dzurak’s team, together with collaborators in Canada, Finland and Japan, report a proof-of-concept quantum processor unit cell that, unlike most designs being explored worldwide, doesn’t need to operate at temperatures below one-tenth of one Kelvin. Dzurak’s team first announced their experimental results via the academic pre-print archive in February last year. Then, in October 2019, a group in the Netherlands led by a former post-doctoral researcher in Dzurak’s group, Menno Veldhorst, announced a similar result using the same silicon technology developed at UNSW in 2014. The confirmation of this ‘hot qubit’ behaviour by two groups on opposite sides of the world has led to the two papers being published ‘back-to-back’ in the same issue of Nature today. Qubit pairs are the fundamental units of quantum computing. Like its classical computing analogue – the bit – each qubit characterises two states, a 0 or a 1, to create a binary code. Unlike a bit, however, it can manifest both states simultaneously, in what is known as a “superposition”. Cheaper and easier to integrate The unit cell developed by Dzurak’s team comprises two qubits confined in a pair of quantum dots embedded in silicon. The result, scaled up, can be manufactured using existing silicon chip factories, and would operate without the need for multi-million-dollar cooling. It would also be easier to integrate with conventional silicon chips, which will be needed to control the quantum processor. A quantum computer that is able to perform the complex calculations needed to design new medicines, for example, will require millions of qubit pairs, and is generally accepted to be at least a decade away. This need for millions of qubits presents a big challenge for designers. “Every qubit pair added to the system increases the total heat generated,” explains Dzurak, “and added heat leads to errors. That’s primarily why current designs need to be kept so close to absolute zero.” The prospect of maintaining quantum computers with enough qubits to be useful at temperatures much colder than deep space is daunting, expensive and pushes refrigeration technology to the limit. The UNSW team, however, have created an elegant solution to the problem, by initialising and “reading” the qubit pairs using electrons tunnelling between the two quantum dots. The proof-of-principle experiments were performed by Dr Henry Yang from the UNSW team, who Dzurak describes as a “brilliant experimentalist”. The Latest Updates from Bing News & Google News Go deeper with Bing News on: - Quantum control for advanced technology: Past and present Quantum devices are a promising technological advance for the future, but this will hinge on the application of quantum optimal control top real-world devices. A new review looks at the status of the ... - Why Israel is moving to quantum computing The Israel Innovation Authority (IIA) has selected Quantum Machines to establish its national Quantum Computing Center. - Computational power unleashed with quantum digits In a recent study published in Nature Physics, researchers at the University of Innsbruck, Austria, have unleashed the hidden computational resources that | Technology ... - Bosch’s new partnership aims to explore quantum digital twins Bosch's partnership with Multiverse Computing is an example of how many legacy companies are exploring quantum computing today to prepare for more capable hardware. - NRL Launches Quantum Navy YouTube Series The Great Power Competition Mark your calendars! Quantum Navy, a new three-part video series, is coming to our YouTube channel Wednesday, Aug. 3. WASHINGTON, Aug. 01, 2022 (GLOBE NEWSWIRE) -- The U.S. Go deeper with Google Headlines on: Go deeper with Bing News on: Practical quantum computers - NetworkNewsAudio – DPCM Capital Inc. (NYSE: XPOA) Gaining Recognition for Quantum Computing Products, Services To view the full editorial, visit To hear the NetworkNewsAudio version, visit About DPCM Capital Inc. DPCM Capital, a special purpose acquisition company, on February 9, 2022, announced its entry ... - How Can Governments Use Quantum Computing for Public Sector? With a global focus on quantum technology, can quantum computing technologies address public sector problems today? Guest blog by D-Wave Systems ... - Priming your business for the new age of quantum computing Last year also saw the establishment of the National Quantum Computing Centre (NQCC). Commenting on EY’s research when it was published, Dr Simon Plant, deputy director for Innovation at the NQCC, ... - This Startup Raised $9 Million To Make Better Quality Quantum Computers While there are quantum computers out there in operation, developed by companies like D-Wave, Rigetti, IBM and Google, none have the capability to solve practical problems any faster than a ... - Researchers create order from quantum chaos Pair Spin Signatures From Macroscopically Aligned Heteroacenes in an Oriented Single Crystal," National Renewable Energy Laboratory (NREL) researchers Brandon Rugg, Brian Fluegel, Christopher Chang, ...
<urn:uuid:e2fde985-e012-4944-9bc8-ec11d16c0aa1>
CC-MAIN-2022-33
https://innovationtoronto.com/2020/04/breaking-one-of-the-biggest-constraints-on-the-way-to-practical-quantum-computers/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572898.29/warc/CC-MAIN-20220817092402-20220817122402-00265.warc.gz
en
0.907565
1,483
3.78125
4
Rachel Goldman’s lab is working to produce “designer alloys” with carefully tailored electrical and light-absorbing properties. These materials could one day be used to build solar cells with double the efficiency of the flat-panel silicon cells that dot rooftops today. The new cells, called concentrator photovoltaics, use gallium arsenide semiconductors instead of the silicon-based semiconductors used in today’s cells. Gallium arsenide could move us toward the utility-scale solar arrays we’ll need to make solar energy a large part of our electrical infrastructure. In her most recent paper, Goldman and her collaborators moved forward the science by figuring out how incorporating small fractions of nitrogen and bismuth in gallium arsenide semiconductors affects their structure and light-absorbing properties, creating a new map for bandgap engineering of designer semiconductor alloys. The advance could accelerate the development of concentrator photovoltaics, and could also lead to advances in semiconductor lasers and quantum computing. Goldman is a professor of materials science and engineering. We sat down with her recently to learn more about her work. How is your “magic ratio” useful in solar cells? Concentrator photovoltaics will depend on the development of alloys that are safer and less expensive than those currently used in gallium arsenide semiconductors. In our earlier research, we developed alloys that use a combination of nitrogen and bismuth. Since then, we’ve been working to develop a more complete understanding of exactly how the nitrogen-bismuth combination functions, and how changing the proportion of those two elements affects the alloy’s overall properties. That research led us to the “magic ratio”—the precise proportion of bismuth to nitrogen that works best with a gallium arsenide substrate. We’ve found that by slightly tweaking that ratio within a certain range, we can control what bandwidth of light that the alloy absorbs. What’s the main hurdle standing in the way of concentrator photovoltaics? Turning “near-infrared” light into electricity is one big challenge—this is light that’s just outside the visible spectrum. A gallium arsenide solar cell consists of several thin layers of metal alloy sprayed onto a gallium arsenide substrate. It’s these thin layers that turn light into electrical charge. Each layer absorbs only a specific wavelength of light. A wavelength that slips through one layer can be caught by the next. The “magic ratio” should help researchers dial in the exact mix of an alloy to absorb whatever bandwidth of light they choose. How were you able to do what others couldn’t? We had to start by acknowledging that the conventional way of thinking about alloy composition doesn’t work for bismuth-nitrogen alloys. Making an alloy out of individual atoms is a little like filling a box with a mix of differently-sized marbles. If you know the sizes of the marbles and the size of the box, you can calculate the combination of marbles that will fill the box exactly. Researchers can calculate the composition of most alloys by using x-ray diffraction to measure the “box” and then calculating the combination of atoms that fits. That doesn’t work with bismuth and nitrogen. Bismuth is very large and nitrogen is very small, so it’s more like mixing sand and marbles. It’s hard to measure the size of a single grain of sand and even harder to predict how it will flow around all those marbles. So we worked with labs in New Mexico, Poland and Romania, as well as here at U-M, to develop a series of measurements that would each solve part of the puzzle. Then we brought them all together to precisely determine the ratio of nitrogen to bismuth in a wide range of sample alloys, and how that ratio affects light absorption properties. Where else might these kinds of alloys be useful? A better understanding of nitrogen-bismuth alloys could help us build more efficient infra-red lasers, which are widely used in fiber-optic communications and in the military. They could also be used in quantum computing, to build transistors that use the spin of electrons as a way to store information. When will the results of this research go into widespread use? There’s still a lot of progress to be made. But this research opens the door to a better understanding of exactly how these alloys work and how to make them do what we want, in solar power and elsewhere. Goldman’s most recent paper is titled “Mapping the composition-dependence of the energy bandgap of GaAsNBi alloys.” It is published in the August 23, 2019 issue of Applied Physics Letters. U-M graduate researcher Jordan Occena, T. Jen and J.W. Mitchell are also authors on the paper. An earlier, related paper is titled “Bi-enhanced N incorporation in GaAsNBi alloys.” It published in the June 15, 2017 issue of Applied Physics Letters.
<urn:uuid:2e3e2936-4619-4793-9e1f-5de106aa2f8f>
CC-MAIN-2022-33
https://news.engin.umich.edu/2019/09/the-magic-ratio-that-could-power-tomorrows-solar-cells/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882571584.72/warc/CC-MAIN-20220812045352-20220812075352-00065.warc.gz
en
0.925602
1,091
3.625
4
Table of contents: - What is a phenomenon under study? - Why do we study phenomena? - What is a phenomenon in biology? - What is anchoring effect give example? - What are the 3 dimensions of Ngss? - What is the anchoring effect in psychology? - Does anchoring really work? - How do you stop the anchoring effect? - What are the five keys to anchoring? - When should you avoid anchoring? - What is the anchoring rule in negotiation? - Who should make the first offer? - Why you should never split the difference? - Who should make the first move in a negotiation? - Why you should never accept the first offer? - How do you negotiate? - What are the 5 stages of negotiation? - How do you ask for a lower price? - How do you ask for a lower rent price? What is a phenomenon under study? A phenomenon (plural, phenomena) is a general result that has been observed reliably in systematic empirical research. In essence, it is an established answer to a research question. ... Phenomena are often given names by their discoverers or other researchers, and these names can catch on and become widely known. Why do we study phenomena? Often simple events, when looking at them through a scientific eye, can elicit curiosity and questions in students and adults. ... By having students observe and explain smaller related phenomena first, they can then be challenged to explain the larger and more complicated phenomenon. What is a phenomenon in biology? Important Biological Phenomena. Biology is the study of living organisms, including their structure, functioning, evolution, distribution and interrelationships whereas a Biological phenomenon is the series of chemical reactions or other events that result in a transformation. What is anchoring effect give example? Anchoring bias occurs when people rely too much on pre-existing information or the first information they find when making decisions. For example, if you first see a T-shirt that costs $1,200 – then see a second one that costs $100 – you're prone to see the second shirt as cheap. What are the 3 dimensions of Ngss? The term "three-dimensional learning" refers to the three pillars that support each standard, now called "performance expectations." These three dimensions are: Science and Engineering Practices, Crosscutting Concepts, and Disciplinary Core Ideas. You can use this rubric to evaluate your own curriculum for NGSS. What is the anchoring effect in psychology? The anchoring effect is a cognitive bias that describes the common human tendency to rely too heavily on the first piece of information offered. ... Once an anchor is set, other judgments are made by adjusting away from that anchor, and there is a bias toward interpreting other information around the anchor. Does anchoring really work? Anchoring is a powerful force, an unconscious response to information. It's not a guarantee of a win, but it is a factor to be aware of when you enter into any negotiations – or retail sales. Using it effectively, and knowing when it's being used on you, is critical in arriving at a satisfactory result. How do you stop the anchoring effect? Outsmart the biasAcknowledge the bias. Being aware of your bias is the first step. Know the weaknesses of your mind and anticipate prejudiced judgement. ... Delay your decision. The second step involves slowing your decision-making process and seeking additional information. ... Drop your own anchor. What are the five keys to anchoring? The Five Keys to Anchoring:The Intensity of the Experience.The Timing of the Anchor.The Uniqueness of the Anchor.The Replication of the Stimulus.Number of Times. When should you avoid anchoring? Important. You should never anchor in, or otherwise obstruct passage through, channels or areas such as launching ramps or any other high-traffic areas. What is the anchoring rule in negotiation? Answer: A well-known cognitive bias in negotiation, anchoring is the tendency to give too much weight to the first number put on the table and then inadequately adjust from that starting point. Who should make the first offer? Whoever makes the first offer, whether seller or buyer, is usually more effective in the negotiation. The power of first offers is strong thanks to the science of the anchor effect. Anchoring is an irrational part of human decision making—what's called a cognitive bias. Why you should never split the difference? The idea that we should approach social interactions as negotiations will feel distasteful to many. According to Voss, that is because we misunderstand what a negotiation is. ... Never Split the Difference provides the reader with a series of straightforward and actionable negotiating strategies. Who should make the first move in a negotiation? Common wisdom for negotiations says it's better to wait for your opponent to make the first offer. In fact, you may win by making the first offer yourself. Why you should never accept the first offer? Power Negotiators know that you should never say Yes to the first offer (or counter-offer) because it automatically triggers two thoughts in the other person's mind. How do you negotiate? 5 Tips for Negotiating BetterMake the first offer. One of the best negotiating strategies is to seize control of the bargaining table. ... When discussing money, use concrete numbers instead of a range. ... Only talk as much as you need to. ... Ask open-ended questions and listen carefully. ... Remember, the best-negotiated agreement lets both sides win. What are the 5 stages of negotiation? Negotiation Stages IntroductionThere are five collaborative stages of the negotiation process: Prepare, Information Exchange, Bargain, Conclude, Execute.There is no shortcut to negotiation preparation.Building trust in negotiations is key.Communication skills are critical during bargaining. How do you ask for a lower price? 5 Tips On How To Negotiate Fair Prices Without Offending The SellerBe Reasonable When Negotiating. ... If You Don't Have the Money, Don't Offer It. ... Ask For a Lower Price. ... Be Friendly. ... Don't Be Afraid to Move On.2 How do you ask for a lower rent price? How to Negotiate Your RentAsk the landlord if rent price is open to discussion. ... Highlight your strengths as a tenant. ... Inquire about extending the lease. ... Offer to end the lease in the summer. ... Research the property's value. ... Be open to compromise. ... Negotiate directly, follow up in writing. ... Have a backup plan. - Which of the following phenomena take place inside an optical Fibre? - What are some weather phenomena? - How does bronchitis differ from pneumonia? - What do we mean by embargo? - What is a wave phenomenon? - What is the phenomenon known as aurora borealis? - What is earthquake and its effects? - What type of lung disease is pneumonia? - What exactly is quantum computing? - How do you call a girl beautiful without saying it? You will be interested - What is subordinate evaluation? - What is sentence give me 5 examples? - What does such mean in English? - What are the benefits of philosophy? - How do you appraise subordinates? - How does pneumonia appear on a chest X ray? - What does Psychogenesis mean? - What is phenomenon in science? - What is the opposite of phenomena? - What is Zone phenomenon?
<urn:uuid:fe8320cd-24d8-42f9-9bbf-fc57e34024c0>
CC-MAIN-2022-33
https://psichologyanswers.com/library/lecture/read/10627-what-is-a-phenomenon-under-study
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882573849.97/warc/CC-MAIN-20220819222115-20220820012115-00265.warc.gz
en
0.925425
1,590
3.65625
4
via UNSW Sydney Quantum engineers from UNSW Sydney have removed a major obstacle that has stood in the way of quantum computers becoming a reality: they discovered a new technique they say will be capable of controlling millions of spin qubits – the basic units of information in a silicon quantum processor. Until now, quantum computer engineers and scientists have worked with a proof-of-concept model of quantum processors by demonstrating the control of only a handful of qubits. But with their latest research, published today in Science Advances, the team have found what they consider ‘the missing jigsaw piece’ in the quantum computer architecture that should enable the control of the millions of qubits needed for extraordinarily complex calculations. Dr Jarryd Pla, a faculty member in UNSW’s School of Electrical Engineering and Telecommunications says his research team wanted to crack the problem that had stumped quantum computer scientists for decades: how to control not just a few, but millions of qubits without taking up valuable space with more wiring, using more electricity, and generating more heat. “Up until this point, controlling electron spin qubits relied on us delivering microwave magnetic fields by putting a current through a wire right beside the qubit,” Dr Pla says. “This poses some real challenges if we want to scale up to the millions of qubits that a quantum computer will need to solve globally significant problems, such as the design of new vaccines. “First off, the magnetic fields drop off really quickly with distance, so we can only control those qubits closest to the wire. That means we would need to add more and more wires as we brought in more and more qubits, which would take up a lot of real estate on the chip.” And since the chip must operate at freezing cold temperatures, below -270°C, Dr Pla says introducing more wires would generate way too much heat in the chip, interfering with the reliability of the qubits. “So we come back to only being able to control a few qubits with this wire technique,” Dr Pla says. The solution to this problem involved a complete reimagining of the silicon chip structure. Rather than having thousands of control wires on the same thumbnail-sized silicon chip that also needs to contain millions of qubits, the team looked at the feasibility of generating a magnetic field from above the chip that could manipulate all of the qubits simultaneously. This idea of controlling all qubits simultaneously was first posited by quantum computing scientists back in the 1990s, but so far, nobody had worked out a practical way to do this – until now. “First we removed the wire next to the qubits and then came up with a novel way to deliver microwave-frequency magnetic control fields across the entire system. So in principle, we could deliver control fields to up to four million qubits,” says Dr Pla. Dr Pla and the team introduced a new component directly above the silicon chip – a crystal prism called a dielectric resonator. When microwaves are directed into the resonator, it focuses the wavelength of the microwaves down to a much smaller size. “The dielectric resonator shrinks the wavelength down below one millimetre, so we now have a very efficient conversion of microwave power into the magnetic field that controls the spins of all the qubits. “There are two key innovations here. The first is that we don’t have to put in a lot of power to get a strong driving field for the qubits, which crucially means we don’t generate much heat. The second is that the field is very uniform across the chip, so that millions of qubits all experience the same level of control.” Although Dr Pla and his team had developed the prototype resonator technology, they didn’t have the silicon qubits to test it on. So he spoke with his engineering colleague at UNSW, Scientia Professor Andrew Dzurak, whose team had over the past decade demonstrated the first and the most accurate quantum logic using the same silicon manufacturing technology used to make conventional computer chips. “I was completely blown away when Jarryd came to me with his new idea,” Prof. Dzurak says, “and we immediately got down to work to see how we could integrate it with the qubit chips that my team has developed. “We put two of our best PhD students on the project, Ensar Vahapoglu from my team, and James Slack-Smith from Jarryd’s. “We were overjoyed when the experiment proved successful. This problem of how to control millions of qubits had been worrying me for a long time, since it was a major roadblock to building a full-scale quantum computer.” Once only dreamt about in the 1980s, quantum computers using thousands of qubits to solve problems of commercial significance may now be less than a decade away. Beyond that, they are expected to bring new firepower to solving global challenges and developing new technologies because of their ability to model extraordinarily complex systems. Climate change, drug and vaccine design, code decryption and artificial intelligence all stand to benefit from quantum computing technology. Next up, the team plans to use this new technology to simplify the design of near-term silicon quantum processors. “Removing the on-chip control wire frees up space for additional qubits and all of the other electronics required to build a quantum processor. It makes the task of going to the next step of producing devices with some tens of qubits much simpler,” says Prof. Dzurak. “While there are engineering challenges to resolve before processors with a million qubits can be made, we are excited by the fact that we now have a way to control them,” says Dr Pla. More from: University of New South Wales The Latest Updates from Bing News & Google News Go deeper with Bing News on: Silicon quantum processor - MIT’s New Analog Synapse Is 1 Million Times Faster Than the Synapses in the Human Brain New Hardware Delivers Faster Computation for Artificial Intelligence, With Much Less Energy MIT engineers working on “analog deep learning” have found a way to propel protons through solids at ... - New hardware offers faster computation for artificial intelligence, with much less energy As scientists push the boundaries of machine learning, the amount of time, energy, and money required to train increasingly complex neural network models is skyrocketing. A new area of artificial ... - New quantum encryption method could lead to truly secure communication Researchers said their method of quantum encryption could lead to secure communication that is 'fundamentally beyond' an adversary's control. - US-Irish partnership gets €3m to lay foundations of the quantum internet Researchers in Ireland, Northern Ireland and the US aim to link quantum computers together over a quantum internet to boost their power. - Best of Last Week—New phase of matter, a replacement for silicon, a better vaccine for omicron subvariants It was a good week for physics as a team with members affiliated with several institutions in the U.S. created a strange new phase of matter in a quantum computer that acted like it had two dimensions ... Go deeper with Google Headlines on: Silicon quantum processor Go deeper with Bing News on: - Why Israel is moving to quantum computing The Israel Innovation Authority (IIA) has selected Quantum Machines to establish its national Quantum Computing Center. - Bosch’s new partnership aims to explore quantum digital twins Bosch's partnership with Multiverse Computing is an example of how many legacy companies are exploring quantum computing today to prepare for more capable hardware. - HCL Technologies to support Sydney-based universities on quantum computing research and development HCL Technologies has signed a memorandum of understanding with four Sydney universities on how the Indian multinational IT services and consulting company will support quantum computing research and ... - Inca Knots Inspire Quantum Computer We think of data storage as a modern problem, but even ancient civilizations kept records. While much of the world used stone tablets or other media that didn’t survive the centuries, the ... - New method of controlling qubits could advance quantum computers Quantum computing, a field that relies on the principles of quantum mechanics to calculate outcomes, has the potential to perform tasks too complex for traditional computers and to do so at high ...
<urn:uuid:ce14f761-c3be-40ec-b316-12d585b94c9d>
CC-MAIN-2022-33
https://innovationtoronto.com/2021/08/removed-a-major-obstacle-that-has-stood-in-the-way-of-quantum-computers-becoming-a-reality/
s3://commoncrawl/crawl-data/CC-MAIN-2022-33/segments/1659882572908.71/warc/CC-MAIN-20220817122626-20220817152626-00466.warc.gz
en
0.936086
1,785
3.546875
4
At SAND14 in San Jose, California, Quantum Physicist John Hagelin spoke about the theory that entangled particles are connected through wormholes – the rabbit hole of physics, providing a physical link to enable instantaneous tunneling through space. Let’s take a closer look at that perspective. A wormhole, officially known as an Einstein–Rosen bridge, is any structure connecting two regions or areas otherwise distant or unrelated. It is a hypothetical topological feature of spacetime, fundamentally a shortcut through spacetime. A wormhole is much like a tunnel with two ends, each with separate points in spacetime. In principle, two widely separated black holes can be connected to each other and look like trumpet horns making shortcut through spacetime. Try to visualize space as a two-dimensional (2D) surface. In this way, a wormhole can be pictured as a surface that leads into a 3D tube (the inside surface of a cylinder). The tube then re-emerges at another location on the 2D surface with a similar entrance hole. The actual wormhole would be equivalent to this, but with the spatial dimensions plus one. For example, instead of circular holes in a 2D plane, the two wormhole`s mouths could actually be spheres in 3D space. Wormholes have long been discussed as a possible mode of interstellar travel and even of time travel. The recently released movie Interstellar is greatly inspired by this phenomenon. They are also fairly well-popularized by science fiction, especially Star Trek: Deep Space Nine, which depicts a large traversible wormhole that allows the characters to travel from familiar regions of space to a distant and unrelated area on the other side of the galaxy. Wormholes and quantum entanglement Two quantum-entangled particles always instantly adopt correlated values, no matter how much distance separates them. If a quantum-entangled pair is depicted as a pair of twins, as one twin is raising the right hand, the other invariably and simultaneously raises the left hand. Being able to explain this phenomenon through the wormhole connection, reduces the spookiness Einstein referred to when he talked about entanglement. Few theoretical physicists have imagined a connection between the concept of entanglement and that of a wormhole, and a hypothetical connection between black holes that serves as a shortcut through space. Juan Maldacena, theorist at the Institute for Advanced Study in Princeton, New Jersey, and Leonard Susskind, theorist at Stanford University in Palo Alto, California, have observed the entanglement of quantum states of two black holes - and pulling the black holes apart. When that happens, they argued, a bona fide wormhole forms between the two black holes. According to Maldacena and Susskind, it could also be possible to create a wormhole connection between two ordinary quantum particles such as quarks that make up protons and neutrons. Kristan Jensen of the University of Victoria in Canada and Andreas Karch of the University of Washington, Seattle assume that the 3D space where the quarks reside is a hypothetical boundary of 4D world. In this 3D space, the entangled pair is connected with a kind of conceptual string. But in the 4D space, the string becomes a wormhole. Julian Sonner of the Massachusetts Institute of Technology in Cambridge builds upon Karch’s and Jensen’s work. He observed that a quark-antiquark pair popping up produces a strong electric field which then sends it to the oppositely charged particles accelerating in opposite directions. Sonner also found that the entangled particles in the 3D world are connected with wormholes in the 4D world. To arrive at this result, Jensen, Karch and Sonner use the so-called holographic principle, a concept invented by Maldacena stating that a quantum theory with gravity in a given space is equivalent to a quantum theory without gravity in a space with one less dimension that makes up the original space’s boundary. In other words, black holes inside 4D space and a wormhole between them are mathematically equivalent to their holographic projections existing on the boundary in 3D. These projections are essentially elementary particles that function according to the laws of quantum mechanics, without gravity and a string connecting them. The wormhole and the entangled pair don’t live in the same space, but mathematically they are equivalent. Susskind and Maldacena argued that the original quantum particles reside in a space without gravity. In a simplified gravity-free 3D model of our world, there can’t be any black holes or wormholes. Susskind adds that the connection between a wormhole and entanglement in a higher dimensional space is a mere mathematical analogy. The wormhole and entanglement equivalence only makes sense in a theory with gravity. However, Karch and his colleagues said that their calculations are an important first step toward verifying Maldacena and Susskind’s theory. Their toy model without gravity gives a concrete realization of the idea that wormhole geometry and entanglement can be different manifestations of the same physical reality. Promoting his new book 'The Myth of Normal' on the Tim Ferris Show Please enter your email and we’ll send you instructions to reset your password
<urn:uuid:bdc28f6d-f8c3-49e2-83cb-4e957592916e>
CC-MAIN-2023-06
https://www.scienceandnonduality.com/article/quantum-entanglement-and-wormholes
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500044.16/warc/CC-MAIN-20230203055519-20230203085519-00395.warc.gz
en
0.910381
1,085
3.5625
4
A Bose-Einstein condensate is a state of matter in which atoms lose their individual identities and instead behave as a single entity. In this state, all of the atoms have the same wavelength, meaning they vibrate in unison. Bose-Einstein condensates are incredibly difficult to create and study, but they could potentially revolutionize computing. Everywhere we look around ourselves, we see matter. The device you’re reading this article on, the air we breathe, along with all life on Earth is made up of matter. We can safely say that matter is everything composed of atoms. The reason we see matter taking so many different forms is because it exists in many different states. Generally, matter exists in a certain amount of states at classical conditions, but when subjected to extreme conditions, matter is found to behave in different states altogether. One such state of matter, found at extremely critical conditions, was discovered by two legendary scientists, Satyendra Nath Bose and Albert Einstein. This state of matter was therefore given the name Bose-Einstein Condensate. First, however, to understand Bose-Einstein Condensate, we must look at the classical states of matter, refreshing how how atoms behave in them and how matter flows from one state to another. Recommended Video for you: The Change Of States Of Matter Matter has many states in which it can exist. The state of matter depends on the interaction of atoms between one another, as well as the energy levels of every atom as a whole. Matter can change from one state to another when subjected to different temperatures and pressures. Under classical physical conditions, matter can exist in four states: The best example for depicting changes in states of matter is water. Below 0°C, water exists in its solid state—ice. Upon heating ice above 0°C at standard pressure, it gets converted into liquid water. Upon heating liquid water above 100°C at standard pressure, we obtain steam, which is the gaseous form of water. Steam, when it undergoes the process of ionization, which adds or removes an electron to create ions, generates the plasma state of water. The energy of atoms is the governing body to determine in which state of matter a substance is found. When we impart heat to atoms, we basically give them energy. That energy is absorbed by the atoms as they begin convert this energy into motion. This is essentially what we see during such change of states of matter. Atoms in solids have very little energy and vibrate with low amplitudes, which is why solids stay in one place. When we heat solids, we impart them with energy. The atoms then begin vibrating with more energy and higher amplitudes. This is when we obtain liquids and gases, both of which have a tendency to flow, rather than remain stagnant. However, when we talk about Bose-Einstein Condensate, we are not talking about standard terms of physical conditions. Bose-Einstein Condensates are generally made in temperatures that are millions of times colder than space itself. Thus, to get a better understanding of the Bose-Einstein Condensate, we must go into the quantum physics of an atom. A Dive Into The Quantum Realm Quantum Physics is the branch of physics dealing with subatomic particles and all matter and energy at the smallest scales. Quantum Physics also describes the laws governing an atom. In 1924, Louis-Victor de Broglie claimed that all matter had a wave-like nature. This actually laid the basis for Quantum Physics. What this meant was that all matter could exist like both a particle and a wave at the same time! The reason why we don’t see this wave particle duality very often is because the mass of all objects around us has millions of millions of million more mass than the subatomic particles quantum physics deals with. In short, the objects around us have so much mass that their wave nature is almost invisible, but in small objects like electrons, we see this phenomenon more plainly. Quantum physics also states that each atom has its own identity. Each atom has its own unique wavelength (since it behaves like a wave) and has its own individuality as a particle. We’re able to distinguish one atom from another due to certain qualities, similar to how we can distinguish between two human beings. We must keep these laws in mind when talking about Bose-Einstein Condensate. Turning The Microscope On The Bose-Einstein Condensate Most of us know that there is no temperature lower than Absolute Zero, which is -273 °C or 0 K. Absolute Zero is that temperature at which atoms have no energy and cease motion entirely. So, what happens when you cool a gas with low density to temperatures only a fraction above Absolute Zero? Well, the answer to this question is… the Bose-Einstein Condensate! It was found that upon cooling matter at temperatures just a whisker above 0 K, the material enters another state of matter, suitably named Bose-Einstein Condensate. We already know that when atoms are cooled to lower temperatures, they have lower energy levels. Thus, in the Bose-Einstein Condensate state, atoms have near-zero energy levels. Remember the wave-particle duality of atoms covered in Quantum Physics? In a Bose-Einstein Condensate, all the atoms of a substance begin to exhibit a similar wavelength. These wavelengths then begin to overlap. At this point, the atoms undergo an identity crisis. Instead of having multiple different atoms exhibiting different wavelengths, we observe a single atom exhibiting a single wavelength. One atom cannot distinguish itself from another, so we consider the aforementioned single atom to be a “super atom”. To put this very simply, the Bose-Einstein Condensate (BEC) is that state of matter where all the atoms of a particle begin to act as a single atom called a Super Atom. Unlike all the other states of matter, in the BEC, all the atoms vibrate in unison, that is, they all vibrate with the same wavelength with the same time period. This phenomenon could allow the BEC to revolutionize computation, making the realization of quantum computing possible. This concept is immensely tough to grasp and there is still a great deal of research going on related to it, but the BEC could open new and incredible doors of achievement in the world of physics.
<urn:uuid:eced4d04-2038-4482-b68b-fc46b07ae112>
CC-MAIN-2023-06
https://test.scienceabc.com/pure-sciences/bose-einstein-condensate.html
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500095.4/warc/CC-MAIN-20230204075436-20230204105436-00354.warc.gz
en
0.943873
1,353
3.53125
4
Artificial Intelligence (AI) helps machines to learn from experience, adapt to new inputs, and perform human-like tasks. Machine Learning (ML) is a subset of AI that enables software programs to grow increasingly effectively, predicting outcomes without explicitly programming them. The ML algorithm anticipates new output values with preliminary data as input and the future of Software Development. Meaning of AI and Machine Learning The capacity of a digital computer or computer-controlled robot to accomplish activities often associated with intelligent individuals is called Artificial Intelligence (AI). The phrase endeavors to produce systems with human-like cognitive processes, such as the ability to reason, discover meaning, generalize, or learn from prior experience. AI refers to machine intelligence instead of human intelligence. Although no AI can accomplish the full range of jobs that an ordinary person can, specific AIs can match humans in specialized skills. Machine Learning is a branch of artificial intelligence, which is the capability of a machine to replicate intelligent human behavior. AI systems simplify complicated tasks comparable to how people solve issues. ML is a modern breakthrough that has improved a wide range of industrial and professional procedures and our daily lives. AI is a subfield focusing on developing intelligent computer systems that can learn from accessible databases using statistical approaches. Software development predictions for the future: The future of software development is already here. And Software Development may be seen in the current patterns software development teams use. - Innovation Will Spread - Applications will become smaller, and hardware will become obsolete - Quantum Computing Will Change Everything - Software Will Be Proactive - User Experience Will (Still) Be Number One 7 Stages of Machine Learning Machine learning is used in software development to increase software accuracy and dependability by employing algorithms that recognize patterns, categorize data, and generate predictions. It aids in finding code mistakes that might lead to bugs and other issues. - Collecting Data Machines, as you may know, first learn from the data you provide them. So, at this point, we are gathering data to train the model. - Preparing the Data You must arrange your data after you receive it. The most fundamental component is the cleaning and changing it so we may use it. - Choosing a Model The first step in every machine learning project is to decide which model to utilize. Simple linear regression models to more complicated deep learning models are available. - Training the Model In this stage, we will train our model using labeled data and test it with new unlabeled data. We can also do feature engineering like discretization or dimensionality reduction here for accurate predictions. - Evaluating the Model In this stage, we compare our predictions to the actual data to see whether or not our model is correct. - Parameter Tuning Parameter Tuning is one of the essential tasks in machine learning because if your parameters are appropriately tuned, your model will be valuable, if not worse! - Making Predictions Forecasting about the future at this time is done. We employ a learning system trained on data with predetermined outputs. Positive Changes that Machine Learning can bring to Software Development The following are a few Positive Changes that Machine Learning can bring to Software Development: - Detect Deviation from Coding Guidelines ML in real-world applications helps to speed the anomaly detection process and save resources. It can occur not just after the fact but also in actual time. Real-time anomaly detection uses to increase security and resilience in fraud and cyber security areas. - Obtain Code-Based Insights - ML may give various essential insights, such as: - How much legacy code do you have in your IT portfolio? - Do you have any unmaintained code? - How many apps do you have that need to be cloud-ready? - Uncontainerized app percentage - What is slowing down your development? - How frequently do you repurpose code in your organization? - Who are your top-performing programmers? - How well does your team work together? - What vital talents does your team lack? - Machine Learning can help you with coding, code review, and testing As a senior executive in a corporate IT division, you know that application development, code review, and testing are all manual, repetitive chores. On the other hand, ML provides a new generation of automation that goes well beyond the rule-based automation you have previously seen. - Enhance Data Management ML Models Function Successfully on Huge Data Machine learning models work effectively on big data, where they can learn a fantastic range of patterns and trends. Assuring quicker reaction time and reduced memory usage becomes more difficult for data science specialists. Data integration from numerous sources is easier with ML than with classical data indexing. Furthermore, machine learning aids in data infrastructure administration, allowing data engineers to manage data pipelines more effectively. The Future of Software Development ML is used in software development to increase software accuracy and dependability by employing algorithms that recognize patterns, categorize data, and generate predictions. It aids in finding code mistakes that might lead to bugs and other issues. ML is also used to forecast the future of software development by the occurrences based on past user behavior or data. The process of employing ML algorithms to improve software quality is known as machine learning development. In other words, it is a method of automatically identifying and repairing mistakes in your code, allowing it to function more smoothly and satisfy better standards. AI may increase human creativity, liberate humans from complex or pointless duties, and even replace humans in risky positions. Even with this, the future of software development is still possible because AI will not replace developers or programmers anytime soon. However, it may undertake code and creating activities in the future. The advancement of AI technology will work for hand in hand with the digitalization and intelligent upgrading of the sector, resulting in a bright future of software development with limitless potential. Artificial intelligence is the most significant achievement in the realm of software development. Because of its superior neural algorithms, AI-assisted automation minimizes manual participation, reduces complexity, and can handle real-world processes.
<urn:uuid:69055bb7-919e-4d59-858c-066db274309e>
CC-MAIN-2023-06
https://blog.jydigitek.com/ai-machine-learning-and-the-future-of-software-development/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500058.1/warc/CC-MAIN-20230203154140-20230203184140-00756.warc.gz
en
0.91371
1,248
3.78125
4
Steven Galbraith once told me that he expects mathematicians to teach RSA long after the world has migrated to post-quantum algorithms; because it is so easy to explain. Arguably, LWE is easier to explain than RSA but the Approximate Greatest Common Divisors problem (AGCD) is even easier than that and requires only scalars. Thus, it is a nice post-quantum alternative for an undergraduate mathematics module. Someone should perhaps write an undergraduate mathematics textbook introducing cryptography using Approximate Common Divisors. To set a baseline, let’s start with recalling naive RSA. - KeyGen. The public key is and the private key is , with - where and prime, - coprime to and - such that . This naive version of RSA only achieves a basic form of security — OW-CPA — even against classical adversaries: it is hard to recover random messages when eavesdropping. Kids, always implement RSA-OAEP. It is easy to see that an adversary that can factor large integers can break RSA: knowing and permits to compute which permits to compute . (It should be noted, though, that this does not mean an adversary has to factor to solve RSA.) The best known classical algorithm for factoring is the Number Field Sieve (NFS). It has a super-polynomial but sub-exponential complexity of operations. On the other hand, and this is the reason why we care about post-quantum cryptography, an adversary with access to a quantum computer with gates can factor using Shor’s algorithm. Greatest Common Divisors Now, to pivot to GCDs, what if two or more users generate moduli and , i.e. moduli with shared factors? We assume that factoring each of or is hard, but computing , i.e. the largest integer dividing both and , reveals (or a small multiple). We can compute greatest common divisors using the Euclidean algorithm: def gcd(a, b): if b == 0: return a else: return gcd(b, a % b) Approximate Greatest Common Divisors Thus, computing GCDs can break RSA with poor randomness. On the other hand, adding a bit of noise to the problem – going from Greatest Common Divisors to Approximate Greatest Common Divisors – makes the problem (for all we know) hard, even on a quantum computer. The Approximate GCD problem is the problem of distinguishing from uniform with (, and are secret). For the problem to be hard, we require , and . We can build public-key encryption from the AGCD problem as follows: - KeyGen. The public key is a bunch of AGCD samples where the errors are multiples of 2, i.e. and the private key is . It can be shown that all errors being multiples of two does not weaken security. - Enc. For output with , i.e. do a random zero-one combination of the samples in the public key and add . This effectively samples a new AGCD sample and adds . - Dec. , i.e. take the ciphertext mod which produces and the take that mod 2 to recover . If the AGCD problem is hard then this encryption scheme is IND-CPA secure. That’s better than merely OW-CPA but to achieve security against active attacks we would need to apply a generic transform. How would we attempt to solve the AGCD problem? Following the mantra I first heard from Alexander May – first you try exhaustive search, then you try a time-memory trade-off, then you think – let’s start with exhaustive search. Given and we know that and we can simply guess and which costs GCD computations. Thus, under this attack would get way with smaller but there is time-memory trade-off. The basic idea is the realisation that we can replace GCDs by multiplications, if or then we have and . That is, we can compute for all guesses with . The cost of this is GCD computations (yay!), multiplications (boo!), so it does not give us much of a saving. Yet, this can be extended to a time-memory trade-off which recovers with overwhelming probability in time . This is why we require . Finally, a lattice attack. Given and , consider and note that . So there is a linear combination of and that produces something small. This is all nice and well, but we don’t know which to pick! Still, let’s generalise this observation and write it down in matrix form. As before, multiplying on the left by the vector gives which is a vector with small coefficients compared to the . The set of all integer-linear combinations of the rows of matrix is called the lattice spanned by (the rows of) that matrix. Finding short vectors in lattices is assumed to be hard, even on a quantum computer. While the above only sketches that we can break AGCD if we can find short vectors (similar to RSA and factoring), it is also possible to show that if you can solve the AGCD problem then we can also find short vectors in lattices (in contrast to RSA and factoring!). That is, if there is an algorithm efficiently solving the AGCD problem then there exists an algorithm which solves the Learning with Errors problem with essentially the same performance. Then, second step, if there is an algorithm efficiently solving the LWE problem then there exists a quantum algorithm which solves worst-case SIVP instances, i.e. finds short vectors in arbitrary lattices. PS: Homomorphic encryption Given with , we can compute to get and . We can also compute to get . That is, we can compute XOR which suffice to build any gate. Thus, we can compute on encrypted data.
<urn:uuid:dba40c51-2f9f-409f-83a1-bacb616a0c72>
CC-MAIN-2023-06
https://martinralbrecht.wordpress.com/2020/03/21/the-approximate-gcd-problem/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764501555.34/warc/CC-MAIN-20230209081052-20230209111052-00796.warc.gz
en
0.922593
1,262
3.640625
4
Dieser Artikel ist älter als ein Jahr! Earlier this year, the University of Vienna’s Quantum Science and Technology department published their findings regarding a highly secure blind computation process that combines the power of quantum computing and quantum cryptography. In addition to being the world’s first demonstration of this theory, the team notes that this process can one day have huge implications on internet security, particularly in the growing field of quantum cloud computing. And while this experiment is remarkable in it’s own right, I’m not really sure how many people outside the realm of quantum physicists, truly understand what and how this experiment has a real world application. To this end, I recently sat down with Stefanie Barz, team lead on the experiment to try to put things into a real world perspective. They use laser beams To begin, Stefanie provided me with an exclusive view of the quantum computer used to perform the experiment, including the laser array (Yes, they use laser beams!) used to produce the photons that will eventually be used to carry the data. To be clear, this laser array is a component used to produce the necessary photons that are then sent to the quantum computer. She explained that this array is responsible for entangling photons, meaning that they are in a certain state whereby they share a complex connection to each other. Researchers then measure one state of the photon, thus changing the state of said photon and affecting the state of the second photon. Physicists refer to this connection as super correlation, with Einstein referring to the process as “spooky action at a distance,” or “spukhafte Fernwirkung”. It’s precisely this interaction between particles that makes quantum computing far more efficient from the classical process we’re all familiar with. By capitalizing on the ability of quantum particles to be in more than one state at the same time, this allows the computer to perform any number of possible solutions to a given problem simultaneously. From green to red To create these entangled photons, Stefanie and her team use the laser beam to convert the wavelength from green to red, and then on to a blue beam. From this blue beam, the entangled photons are then routed through optical fibers and sent to the quantum computer to be further processed into a cluster state. The cluster state is then used to create the “blind qubit” that will then go on to be measured by the quantum computer. Granted, that might seem like a whole lot of work (and power – did I mention lasers?) just to create few photons, and thus qubits, but keep in mind that we’re not talking about sending a typical email here. What Stefanie and her team have done is create an absolutely secure form of data processing that cannot be intercepted and understood. In addition to the ultra-secure method of transmission and encryption, the end computer is also unable to detect what it is actually processing. Now before you start clamoring for your very own quantum computer to send completely secure emails, keep in mind that these devices are still in their infancy. A practical, real-world quantum computer is still far off, as the one I viewed consumed an entire room, and performed only a simple, yet highly effective computation. Expensive & rare If and when quantum computers do reach a practical level, it’s a fair statement to make that they’ll be quite expensive, and very rare. Enter cloud computing. With the usage of cloud computing growing on a daily rate, instead of needing their own quantum computer, researchers, (evil?) scientists, and others from around the world could theoretically rent or purchase computational time on said devices. Obviously, if you’re in need of the services of a quantum computer, there’s a good chance that you’d really rather not have others knowing exactly what you’re working on. Thus the need for the blind computation, and absolutely secure data processing. Quantum computers do contain entangled qubits, therefor; simply generating and sending qubits isn’t going to solve this security issue fully. What Stefanie and her team have done is add an additional layer of security to this already confounding method of data transfer. The random code The trick here is a series of what appears to be random bits of code, but is in fact pre-encrypted by the sender. This “random” series of data is a form of photon polarization (vertical or horizontal), and remains encrypted throughout the calculation, and is still able to be processed by the quantum computer (although the computer has no idea what it is processing). However, if an eavesdropper where to intercept the data anywhere along the transmission path, they would have no way of knowing exactly how to put the encryption (polarization sequence) together to make any sense of the data. Having created the original encryption, the end receiver can then interpret the results, resulting in an absolutely secure form of data processing. Two quantum algorithms in test In the demonstration conducted at the University of Vienna, Barz and her team tested two quantum algorithms; Deutsch’s, which detects certain regularities in mathematical functions, and Grovers’, which can search an unsorted database (think phone book). They created the above-mentioned “spooky action at a distance” state of photons, and encrypted their data transmission. Having received the photons and created an entangled cluster state, the quantum computer then carried on and began solving the problem. However, because of this extra layer of polarized encryption, there was no way to determine exactly what the computer was doing and/or processing, thus proving the security of their test. The team had to wait until the results were returned to discover if the entire process had actually worked. It’s also worth noting that this level of security is a two-way street. Meaning, those that are responsible for, or even own, a quantum computer are most likely quite protective of their asset. By providing this blind computational process, the sender of the data would have no way of peering into the inner workings of the computer processing their request, and of course, vice versa. Who needs a quantum computer? You and I are probably not going to have any need for a quantum computer in the near future, nor will we be sending data that could cause unrest in certain parts of the world, (sorry, pr0n doesn’t count). With that said, while Stefanie denies any contact, I can’t help but wondering if any government or military organizations have been in touch, as this is the absolute perfect application for such computational power and data transmission. Yes, the blind computational demonstration is a slightly-over-the-top form of secrecy, but in today’s world, there’s a perfect German expression, “Sicher ist Sicher.” (Better safe than sorry). PhD candidate Stefanie Barz of Vienna’s Quantum Science and Technology department
<urn:uuid:b14b2ae6-b214-4ad9-a73c-2132fe4f46af>
CC-MAIN-2023-06
https://futurezone.at/english/ultra-secure-quantum-computing-explained/24.577.063
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499713.50/warc/CC-MAIN-20230129112153-20230129142153-00197.warc.gz
en
0.941099
1,454
3.59375
4
Richard Feynman was an American physicist who contributed significantly to developing quantum mechanics and quantum computing. Feynman was born in 1918 in New York City and received his PhD in physics from Princeton University in 1942. He is well known for his work in quantum electrodynamics (QED), which he developed in the 1940s and 1950s but crucially also for his work towards the ideas of Quantum Computing and even Nanotechnology. Richard Feynman was born on May 11, 1918, in New York City, United States. Richard Feynman was a physicist who significantly contributed to the development of quantum mechanics and quantum computing. Feynman is best known for his work in quantum electrodynamics (QED), which he developed in the 1940s and 1950s. The connection to Quantum Computing In 1982, Feynman gave a lecture at the MIT Computer Science and Artificial Intelligence Laboratory. He proposed using quantum mechanical phenomena to perform calculations that would be impractical or impossible using classical computers. This idea was later developed into the field of quantum computing. Feynman’s ideas and work continue to influence the field of quantum computing, and he is often considered one of the founders of the entire field. Feynman’s idea was to build a controllable quantum environment and use it for analogue quantum computations of things that are hard to simulate. Simulating complex systems often becomes an issue for systems which involve many-body interactions. Simulating a single electron is relatively simple. It is in either state A or state B. We may even say spin-up or spin-down to denote the two possibilities. With two electrons, you have the possibility of having both in state A, both in B, one in A and the other B, or vice versa – a total of four probabilities (AA, AB, BA, BB). With ten electrons, this rises to 1,024 probabilities (2 to the power 10), and 20 electrons have 1,048,576 combinations. Consider how a potential drug molecule binds to a receptor, and you have some idea about the complexity and possible combinations that must be considered. Simulating, therefore, by conventional means becomes “computationally expensive” when you have potentially hundreds of atoms and, consequently, thousands of electrons. But the systems that a scientist wants to investigate often have millions of electrons, and the number of probabilities becomes unworkable. In the late 1970s, Feynman began considering this problem. In a paper published in 1982, “Simulating Physics with Computers“, he postulated that to simulate quantum systems, you would need to build quantum computers. This may seem a bit of a misnomer, that quantum simulations require physical quantum systems, but this is now considered one of the very purposes of quantum computers where there are not enough accessible states available in conventional classical simulations. Richard Feynman was also interested in using quantum mechanics to perform calculations. In 1982, he gave a lecture at the MIT Computer Science and Artificial Intelligence Laboratory in which he proposed the idea of using quantum mechanical phenomena to perform calculations that would be impractical or impossible using classical computers. This idea was later developed into the field of quantum computing. Feynman received the Nobel Prize in Physics in 1965 for his contributions to developing QED (Quantum Electro Dynamics). He died in 1988, but his ideas and work continue to influence the field of physics and the development of quantum computing. The connection to Nanotechnology In 1959, Feynman gave a lecture at the American Physical Society meeting in which he discussed the possibility of manipulating and arranging individual atoms and molecules to create new materials and devices. This lecture, known as the “There’s Plenty of Room at the Bottom” speech, is often considered the starting point for nanotechnology. Nanotechnology is the study and application of tiny things and manipulation of individual atoms and molecules. It involves creating and using materials, devices, and systems with unique properties because of their small size. According to the National Nanotechnology Initiative, Nanotechnology is science, engineering, and technology conducted at the nanoscale, which is about 1 to 100 nanometers. Nanotechnology or Nanotech has the potential to revolutionize many fields, including electronics, medicine, energy production, and materials science. For example, nanotechnology could create more powerful and efficient computer processors, develop new and more effective drugs, and create more robust and lighter materials (nanomaterials). Some have even postulated nanomachines that can get inside the human cell and fix and repair it as needed. To give the full title: “There’s Plenty of Room at the Bottom: An Invitation to Enter a New Field of Physics” was a lecture given by physicist Richard Feynman at the annual American Physical Society meeting at Caltech on December 29, 1959. We can even link nanotechnology and quantum computing, for the techniques that researchers use to develop the latest qubits (the devices that do the computing) are often nanoscale and require fabrication at ever tinier length scales. Just as traditional transistors in microprocessors developed by the likes of Intel, AMD and NVIDIA are getting smaller, so too are the qubits (the quantum analogue of the transistor). For example, quantum dots can be used as semiconducting qubits, where the spin of electrons is used as the “switching unit” and can be operated upon. In 1975, Richard Feynman and his wife, Gweneth Howarth, purchased a Dodge Tradesman Maxivan. They had it decorated with Feynman diagrams, which are symbols that Feynman himself had created to depict complex particle interactions through simple lines and loops. While it might appear arrogant to display one’s intellectual accomplishments in this way, Feynman’s daughter Michelle believes that the decorations on the van reflected Feynman’s passion for physics. The van still exists and has been lovingly restored. The Manhattan Project Richard Feynman was a member of the team of scientists who worked on the Manhattan Project, which was the Allied effort during World War II to develop the first nuclear weapons. Feynman was a relatively junior (at age 24) member of the team, but he made significant contributions to the project, particularly in developing the first successful atomic bomb. When he was at Princeton, Feynman was recruited for the theoretical division of the Manhattan Project. Feynman was present at the first detonation of the atomic bomb. It is thought that radiation exposure may have contributed to his death at 69 from abdominal cancer. There is no question that Richard Feynman inspired generations of scientists, even now, almost four decades after his death. He leaves a legacy of radical thought and new innovative ways to think about problems and has impacted two significant fields: nanotechnology and quantum physics. If you want to read more about his work, I can suggest the following books Richard Feynman has authored:
<urn:uuid:3486fdac-f9e7-4d77-8cbc-39b1f3b87c8c>
CC-MAIN-2023-06
https://quantumzeitgeist.com/richard-feynman-and-his-contributions-to-quantum-computing-and-nanotechnology/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764494852.95/warc/CC-MAIN-20230127001911-20230127031911-00037.warc.gz
en
0.961565
1,439
3.859375
4
In quantum teleportation, the properties of quantum entanglement are used to send a spin state (qubit) between observers without physically moving the involved particle. The particles themselves are not really teleported, but the state of one particle is destroyed on one side and extracted on the other side, so the information that the state encodes is communicated. The process is not instantaneous, because information must be communicated classically between observers as part of the process. The usefulness of quantum teleportation lies in its ability to send quantum information arbitrarily far distances without exposing quantum states to thermal decoherence from the environment or other adverse effects. Although quantum teleportation can in principle be used to actually teleport macroscopic objects (in the sense that two objects in exactly the same quantum state are identical), the number of entangled states necessary to accomplish this is well outside anything physically achievable, since maintaining such a massive number of entangled states without decohering is a difficult problem. Quantum teleportation, is, however, vital to the operation of quantum computers, in which manipulation of quantum information is of paramount importance. Quantum teleportation may eventually assist in the development of a "quantum internet" that would function by transporting information between local quantum computers using quantum teleportation . Below is a sketch of an algorithm for teleporting quantum information. Suppose Alice has state C, which she wants to send to Bob. To achieve this, Alice and Bob should follow the sequence of steps: 1) Generate an entangled pair of electrons with spin states A and B, in a particular Bell state: Separate the entangled electrons, sending A to Alice and B to Bob. 2) Alice measures the "Bell state" (described below) of A and C, entangling A and C. 3) Alice sends the result of her measurement to Bob via some classical method of communication. 4) Bob measures the spin of state B along an axis determined by Alice's measurement Since step 3 involves communicating via some classical method, the information in the entangled state must respect causality. Relativity is not violated because the information cannot be communicated faster than the classical communication in step 3 can be performed, which is sub-lightspeed. The idea of quantum teleportation, which can be seen in the mathematics below, is that Alice's measurement disentangles A and B and entangles A and C. Depending on what particular entangled state Alice sees, Bob will know exactly how B was disentangled, and can manipulate B to take the state that C had originally. Thus the state C was "teleported" from Alice to Bob, who now has a state that looks identical to how C originally looked. It is important to note that state C is not preserved in the processes: the no-cloning and no-deletion theorems of quantum mechanics prevent quantum information from being perfectly replicated or destroyed. Bob receives a state that looks like C did originally, but Alice no longer has the original state C in the end, since it is now in an entangled state with A. Which of the following is true of quantum teleportation? 1) Quantum information is transferred between states 2) The teleported particle is physically transferred between locations 3) A quantum state is cloned between observers 4) Quantum information is permanently removed from the system As a review, recall the Pauli matrices: The spin operators along each axis are defined as times each of for the axes respectively. These Pauli matrices are used to construct Bell states, an orthonormal basis of entangled states for the tensor product space of spin- particles: Measurements that project tensor products of spin states onto the Bell basis are called Bell measurements. Now, follow the algorithm sketched in the previous section. Suppose Alice starts with state C, which she wants to send Bob. State C can be written in the most general form: with and normalized complex constants. 1) Generate an entangled pair of electrons A and B in the Bell state: The state of the full system of three particles is therefore . This is a product state between entangled pair AB and non-entangled C. 2) Alice measures the Bell state of AC, entangling A and C while disentangling B. The process of measuring the Bell state projects a non-entangled state into an entangled state, since all four Bell states are entangled. Expanding Alice's full original state, she starts with: Multiplying out the states and changing to the Bell basis of A and C, this state can be rewritten: When Alice measures the Bell state of A and C, she will find one of , each with probability . Whichever she measures, the state of particle B will be after measurement. 3) To send Bob the state of particle C, therefore, Alice does not need to send Bob the possibly infinite amount of information contained in the coefficients and which may be real numbers out to arbitrary precision. She needs only to send the integer of the Bell state of A and C, which is a maximum of two bits of information. Alice can send this information to Bob in whatever classical way she likes. 4) Bob receives the integer from Alice that labels the Bell state that she measured. After Alice's measurement, the overall state of the system is: Bob therefore applies to the disentangled state on his end, by measuring the spin along axis . Since for all , Bob is left with the overall state: Bob has therefore changed the spin state of particle B to: which is identical to the original state of particle C that Alice wanted to send. The information in state C has been "teleported" to Bob's state: the final spin state of B looks like C's original state. Note, however, that the particles involved never change between observers: Alice always has A and C, and Bob always has B. - Pirandola, S., & Braunstein, S. Physics: Unite to build a quantum Internet. Retrieved from http://www.nature.com/news/physics-unite-to-build-a-quantum-internet-1.19716 - Debenben, . quantum teleportation diagram. Retrieved from https://commons.wikimedia.org/w/index.php?curid=34503176
<urn:uuid:249ffbdd-8203-4071-b608-17084f133dc5>
CC-MAIN-2023-06
https://brilliant.org/wiki/quantum-teleportation/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500095.4/warc/CC-MAIN-20230204075436-20230204105436-00356.warc.gz
en
0.927272
1,288
3.984375
4
At the Consumer Electronics Show (CES) this year, IBM announced that its quantum computer Raleigh has achieved the goal of doubling its Quantum Volume to 32, up from 16 last year. As a matter of fact, IBM has been successful in doubling its systems’ Quantum Volume each year since 2017 when its computer Tenerife demonstrated a Quantum Volume of 4. This simply means the computer has doubled its potential for solving more real-world, complex problems. Even for those of you who are well-acquainted with the nuances of traditional computing, the terms ‘quantum computer’ or for that matter ‘Quantum Volume’ may sound Greek. And that’s pretty understandable. Quantum computing involves the application of quantum physics, which may be too abstruse for many of us. Here, in this article, we shall try to deal with the subject of quantum computing in the simplest manner possible. So, let’s begin. Before we dwell on the basics of a quantum computer, let’s understand how a conventional computer works. This way, you will be able to appreciate the difference better. Any traditional computer (like the one you use in your office or home) stores and processes information using switches called transistors. These switches are similar to the ones you use at home for turning on or off your electrical appliances. This transistor can either be on or off. If it’s on, it can be used for storing number 1. If it’s off, it can store 0. Long strings of 0s and 1s can be used to store any number, text or symbol. Each of the 0s and 1s is referred to as a binary digit (bit) and using a string of eight bits, you can store 255 different characters (A-Z, a-z, 0-9 and commonly used symbols). A conventional computer calculates by using circuits called logic gates that are made of transistors connected together. Quantum computers do not use bits to store information. They use quantum bits or qubits. Each qubit can not only be 0 or 1 but also be both 0 and 1 at the same time or even an infinite number of values in between. Now, why does this happen and what does it mean? This happens because quantum bits are subatomic particles i.e. protons and electrons. Because they are subatomic particles, they do not follow the laws of classical physics. They follow quantum physics instead. One of the basic tenets of quantum physics is the principle of superposition where a particle can exist in multiple states simultaneously, at least until the state is measured and collapses into one. So, quantum bits can exist in multiple states (including 0 and 1) at the same time. This simply means they can store multiple values at once and process them simultaneously. So, instead of working in a sequence (i.e. doing one thing only after the previous one is finished), a quantum computer can work in parallel (i.e. doing several things simultaneously). This property makes it a million times faster than a conventional computer. When a quantum computer starts crunching through a problem, the qubits are in their hybrid state. When the solution is found, the qubits collapse into one of the possible states (0 or 1) for returning the solution. For most of us, the existence of quantum computers will make no difference simply because the tasks we implement in our day-to-day lives can be easily accomplished with conventional computers. Quantum computers will only affect (in case they do) the elite research teams. Certain problems are so complex that they cannot be solved using traditional computers. They have too many possible solutions and the solution is derived through trial and error, guessing until the answer is found. It would take traditional computers thousands of years to arrive at the correct solution. The superposition property shown by quantum bits will slash the guessing time exponentially. While traditional computers make one guess at a time, quantum computers will make multiple guesses, thanks to their multiple states. So, calculations requiring complex guesswork such as the simulation of atomic structures will be easy to carry out. As a result, scientists will be able to create new compounds for use in manufacturing. Researchers will also be able to simulate other complex systems like genetic mutation patterns, financial models and economic market forces. This will offer impetus to path-breaking research work in genetics, finance and economics. Yes, like every revolutionary technology, quantum computers have their dark side too. Quantum computing can have significant repercussions on cryptographic encryption that secures our computers and underpins modern internet communication. Robust encryption is what keeps our data and messages secure. But in the presence of exceptionally powerful quantum computers, the odds of bad actors breaking these encryption algorithms to access sensitive information increase exponentially. Though they were proposed around 3 decades back, the concept of quantum computers remains largely theoretical even to this day. Starting with some breakthroughs in the year 2000, a Canadian company D-Wave Systems announced in 2011 that it has created a 128-qubit computer. Later, in 2016, Isaac Chuang from MIT and scientists from the University of Innsbruck developed a 5-qubit, ion-trap computer that could potentially evolve into a powerful encryption buster. Post-2016, more and more breakthroughs were unveiled in this field. In 2017, for instance, Microsoft announced that it has developed a quantum development kit including a special language Q#, especially for quantum computing applications. The very next year, Google unveiled Bristlecone, a 72-qubit quantum processor that could be harnessed for research in areas of quantum simulation, optimization, and machine learning. In October 2019, Google claimed to have attained what it called ‘quantum supremacy’. As per Google, their newly developed 54-qubit processor, Sycamore needed just 200 seconds to compute a super-complex algorithm which even the world’s fastest supercomputer would have taken 10,000 years. This claim was disputed by IBM. As per IBM, the said calculation could be completed by an existing computer in less than 2.5 years and not 10,000 years, as claimed by Google. Figure 2: The chart shows the quantum computing systems produced by organization(s) in qubits between 1998 and 2019. (Source: Statista) So, while we have seen many breakthroughs happening in quantum computing of late, we will need to wait for several decades before we witness its practical applications. Google may have claimed to have achieved the pinnacle of quantum computing, but what Sycamore performed was just a benchmark test that has no real-world applications, so Google can’t deploy it for solving practical problems anytime soon. Besides, quantum bits are stable only at cryogenic temperatures, so only governments and large corporations like IBM and Google can afford to keep a quantum computer at their premises. The rest of us would have to depend on cloud computing. Need more such blogs on quantum computing? Let us know in the comment section below. Thanks for reading.
<urn:uuid:e96cd256-522e-48d3-a0f1-1530497504b7>
CC-MAIN-2023-06
https://cyfuture.com/blog/the-curious-case-of-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500628.77/warc/CC-MAIN-20230207170138-20230207200138-00357.warc.gz
en
0.942309
1,446
4.0625
4
Researchers at the Indian Institute of Science (IISc) have created a novel hybrid of two remarkable materials called graphene and quantum dots, in a breakthrough that may inspire highly efficient and controllable next-generation displays and LEDs. Quantum dots are semiconductor nanocrystals with the potential to revolutionize diverse technologies, from photovoltaics and medical imaging to quantum computing. They can absorb UV light and produce sharp, bright colours, making them especially attractive for next-generation TVs, smartphones and LEDs. However, they are poor electrical conductors, and therefore inefficient to use in devices on their own. To improve their efficiency, researchers have tried combining them with graphene, an excellent conductor. Adding graphene would also confer the ability to tinker with the output even after fabrication, or turn the device on and off at will. Although the combination works well for photo-detectors and sensors, it is practically useless for displays and LEDs, because quantum dots lose their ability to emit light when fused with graphene. By modifying some experimental conditions, IISc scientists have found a way to eliminate this effect, and create a highly efficient and tunable hybrid material. The results, published in ACS Photonics, open up possibilities for a new generation of state-of-the-art displays and LEDs. Quantum dots are extremely tiny particles with properties vastly superior to conventional semiconductors. When activated by UV light, they can produce visible light in different colours depending on their size. Small dots produce blue light, for example, while large ones radiate red. Quantum dots absorb light very well, but they are poor electrical conductors; quantum-dot based devices that convert light to electricity are therefore not very efficient. Graphene, on the other hand, is almost transparent to light, but it is an excellent electrical conductor. When the two are combined, graphene could, in principle, quickly pull the absorbed energy away from quantum dots — cutting down energy loss — and convert it to an electrical signal, for example. This makes it possible to create devices such as photo-detectors with extremely high efficiency. “You get the best of both,” says senior author Jaydeep Kumar Basu, Professor, Department of Physics, IISc. On the flip slide, the energy transfer to graphene leaves quantum dots with almost no energy left to emit light, making it impossible to use them in displays or LEDs. “That is one area where the application of these hybrid materials has not taken off, because of this effect,” says Basu. “Graphene acts like a sponge as far as the quantum dots are concerned. It does not allow any emission.” Basu’s team tried to overcome this “quenching” effect by bringing into play a phenomenon called superradiance. When individual atoms or emitters (such as quantum dots) in a layer are excited, each one emits light independently. Under certain conditions, all the atoms or emitters can be made to emit light cooperatively. This produces a very bright light, with an intensity significantly greater than the sum total of individual emissions. In a previous study, Basu’s team was able to bring about superradiance in a thin layer of quantum dots by combining it with metal nanoparticles under certain experimental conditions. They recreated those conditions in the new quantum dot-graphene hybrid devices to successfully bring about superradiance, which was strong enough to compensate for the quenching. Using models, they found that this happens when individual quantum dots are 5 nm or less apart, and the quantum dot layer and graphene are separated by a distance of 3 nm or less. “We have shown for the first time that we are able to get away from this ‘sponge’ effect, and keep the emitters alive,” says Basu. When superradiance dominated, the intensity of light emitted in the presence of graphene was also found to be three times higher than what could have been achieved using quantum dots alone. “The advantage with graphene is that you can also tune it electrically,” says Basu. “You can vary the intensity by simply changing the voltage or the current.” The study also opens up new avenues for research on understanding how light and matter interact at the nanoscale, the authors say. Reference: Electrically Tunable Enhanced Photoluminescence of Semiconductor Quantum Dots on Graphene, published in ACS Photonics, June 2017. The study was funded by the Department of Science and Technology (Nanomission). Jaydeep Kumar Basu Department of Physics Indian Institute of Science (IISc)
<urn:uuid:7274d39d-d9a2-46de-a0e9-a1fa543c92a0>
CC-MAIN-2023-06
https://iisc.ac.in/events/novel-hybrid-material-may-inspire-highly-efficient-next-gen-displays/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500095.4/warc/CC-MAIN-20230204075436-20230204105436-00358.warc.gz
en
0.934476
1,028
4.125
4
The quantum computer has been making waves in the tech world recently. It is a computer that uses quantum mechanics to function, and it is believed to be far more powerful than traditional computers. What is a quantum computer: A quantum computer is a computer that uses quantum mechanical phenomena to perform calculations. These computers are different in many ways from the computers that are in use today. For example, a quantum computer can be in multiple states simultaneously, whereas a classical computer can only be in one state at a time. This allows quantum computers to perform several calculations at once. How does a quantum computer work: This computer consists of qubits, which are units of information that can exist in more than one state simultaneously. A qubit is like an atom or particle that can exist in more than one state or energy level at the same time. In contrast, traditional bits can store only two values, 0 and 11 Why is the quantum computer important: It has the potential to revolutionize computing. It is believed that these computers will be able to solve problems that are beyond the capabilities of classical computers. It could also be used to create new materials and drugs, and to develop new ways of communication. How far along is quantum computer development: Currently, there are a few working prototypes of quantum computers. However, these computers are not yet powerful enough to perform most tasks that a classical computer can do. Researchers are working on developing more qubits and increasing the power of these computers. What are some real-world applications for quantum computers: These computers have the potential to be used in a variety of fields, including finance, medicine, and logistics. For example, they could be used to develop new financial products, design more effective drugs, and create better algorithms for routing traffic. Are quantum computers dangerous: Some experts have raised concerns about the potential misuse of these computers. It is possible that these computers could be used to hack into secure systems or to create new viruses. However, these concerns are largely speculative at this point. What are the limitations of quantum computers: Currently, these computers are very limited in terms of their power and capacity. They also require extremely low temperatures and a high degree of stability. As a result, these computers are not yet able to perform most tasks that classical computers can do. Despite these limitations, quantum computers have the potential to revolutionize computing and change the world as we know it. Do you think quantum computers are the future: Yes, I believe that these computers are the future of computing. These computers have the potential to solve problems that are currently unsolvable, and they could have a profound impact on many different fields. I think that these computers will become more powerful and more widely used in the coming years. Thanks for reading! I hope you found this article interesting. a quantum computer, qubits, quantum mechanics Did you know that a qubit is a unit of information that can exist in more than one state simultaneously? This allows these computers to perform several calculations at once! Quantum computers Here are 10 fascinating facts about the quantum computer: 1. The quantum computer was first developed in 1994 by Peter Shor. Shor’s algorithm, which is a quantum algorithm for factoring large numbers, was developed in 1994. This algorithm demonstrated that computers could be used to perform calculations that are beyond the capabilities of classical computers. Since then, these computers have been developed by a number of different companies and organizations, including IBM, Google, and Microsoft. 2. this computer can solve problems much faster than traditional computers. Traditional computers use bits, which can store a maximum of two values (0 or 11). These computers use qubits, which can store a much larger number of values. This allows these computers to perform calculations much faster than traditional computers. Fact, it is estimated that this computer could perform certain tasks in seconds that would take traditional computers billions of years to complete! These computers are not yet powerful enough to perform most tasks that a classical computer can do. However, they have the potential to revolutionize computing and change the world as we know it. 3. This computer can hold much more information than traditional computers. This is because qubits can exist in multiple states simultaneously. This allows these computers to perform several calculations at once! Traditional computers use bits, which can store only two values (0 or 11). In contrast, qubits can store a virtually unlimited amount of information. As a result, these computers have the potential to be much 4. This computer is immune to hacking and viruses. This is because these computers use qubits, which are units of information that can exist in more than one state simultaneously. This makes it impossible for hackers to access the data stored on a quantum computer. Additionally, quantum computers are not susceptible to viruses due to their qubit-based architecture. 5. This computer could eventually lead to the development of artificial intelligence. These are just a few of the many fascinating facts about this computer! If you found this article interesting, be sure to check out our other blog posts about computing. a quantum computer, qubits, quantum mechanics 6. These computers are currently being used for data encryption and security purposes In the future, quantum computers could be used for a variety of tasks, including weather forecasting, early detection of disease, and large-scale simulation of molecules.
<urn:uuid:6bb65793-8cc9-4f51-8a0b-143045c05e49>
CC-MAIN-2023-06
https://r4read.info/2022/06/26/what-are-quantum-computers-some-facts-about-it/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500255.78/warc/CC-MAIN-20230205130241-20230205160241-00438.warc.gz
en
0.955996
1,103
3.53125
4
The equivalent to a wormhole in space–time has been created on a quantum processor. Researchers in the US used an advanced quantum teleportation protocol to open the wormhole and send quantum signals through it. By studying the dynamics of the transmitted quantum information, the team gained insights into gravitational dynamics. The experiment could be further developed to explore quantum gravity or string theory. A wormhole is a bridge in space–time that connects two different locations. While wormholes are consistent with Albert Einstein’s general theory of relativity, they have not been observed by physicists. Unlike wormholes in science fiction, they are not traversable – meaning things cannot pass through them. Although general relativity forbids travelling through a wormhole, it is theorized that exotic matter – matter with negative energy density and negative pressure – could open a wormhole and make it traversable. But these theories are difficult to test, even if one could create a wormhole in a lab. But physics has a trick up its sleeve – in the form of the quantum teleportation of information between two entangled particles. This process occurs instantaneously and therefore emulates the process of sending quantum information through a gravitational wormhole. In both cases, however, it is not possible to communicate faster than the speed of light because a subluminal signal is required to decode the information. Quantum entanglement plays an important role in quantum computing, therefore a quantum processors is the ideal experimental device to explore the similarities between quantum teleportation and wormholes. In this scenario, quantum bits – or qubits – on the quantum processor are entangled with each other and teleportation is the equivalent of the qubit travelling through a wormhole. Down the wormhole Now Maria Spiropulu at Caltech, Daniel Jafferis at Harvard University and colleagues have done such an experiment. Their aim was to create a system that has the right ingredients for the type of teleportation that resembles a wormhole. An important challenge that they first had to overcome is that it appeared that a large number of qubits would be needed to perform the experiment properly – many more qubits than are available in today’s quantum processors. To solve this problem, the researchers used machine learning to work out the minimum number of qubits required and how they should be coded to set up the quantum teleportation protocol. They discovered that they could create the wormhole dynamics on nine qubits with 164 two-qubit gates on a Google Sycamore quantum processor. In their experiment the researchers showed that they could keep a wormhole open for a sufficient amount of time by applying negative energy shockwaves, which came in the form of special pulses of quantum fields. They then studied the dynamics of the quantum information that was sent through. Signals that travel through a wormhole experience a series of scrambling and unscrambling, with the quantum information exiting the wormhole intact. On the Sycamore processor, they measured how much quantum information passed from one side to the other, when applying a negative versus a positive energy shockwave. And because only negative energy shockwaves would open up the wormhole, they found that only these shockwaves allowed signals to pass through. Overall, the information passing through the wormhole had key signatures of a traversable wormhole. This constitutes a step towards probing gravitational physics using quantum processors and could lead to the development of powerful testbeds to study ideas of string theory and quantum gravity. Quantum complexity could solve a wormhole paradox Juan Maldacena at the Institute for Advanced Study, in Princeton, US, who was not involved in the research, describes the work as an interesting first step in trying to create complex quantum systems that can have an emergent space–time description. He thinks the result is important because it is a demonstration that allows us to experimentally test some of the theoretical ideas about the connection between quantum mechanics and emergent space–time geometry. He says the research’s biggest achievement is that it has reproduced a kind of quantum teleportation that is inspired by gravitational problems. Team member Daniel Jafferis believes that there are many additional protocols and new ideas to explore and he expects more “gravity experiments” to be performed by quantum computers in the future. He thinks that some of these will require much larger quantum computers or much deeper circuits, but that others are well-suited for near-term experimentation. “One of the things we would like to do next is to realize somewhat larger systems and try to observe more detailed structure of the emergent wormholes and their gravitational dynamics”, he tells Physics World. Edward Witten, also at the Institute for Advanced Study and not involved in this research, says that the authors have done a nice job of describing a simplified version of the protocol that could be realized experimentally. He calls this experiment – and the presumed improvements that may be possible – to be a “milestone” in developing control over microscopic quantum systems. He states that even though such an experiment can certainly not give the sort of information that comes from physics experiments such as LIGO or the LHC, success with such experiments can confirm the validity of quantum mechanics in a rather subtle situation and also confirm that the theory has been analysed correctly. The research is described in Nature.
<urn:uuid:e6dc5a83-a068-405d-9aac-0bbfdc6d0aca>
CC-MAIN-2023-06
https://gulpmatrix.com/quantum-teleportation-opens-a-wormhole-in-space-time/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499967.46/warc/CC-MAIN-20230202070522-20230202100522-00559.warc.gz
en
0.939567
1,082
4.15625
4
After a lot of theorising, companies and research institutions are now announcing plans that bring real-world quantum computers close to reality. Quantum computing is coming – but what will it mean for organisations and industries? What is quantum computing? Quantum computing uses the laws of quantum mechanics to solve problems too complex for traditional computers. Quantum bits (or qubits) are the basic unit of information in quantum computing but, unlike the traditional bit, it can be a one, a zero or both at the same time, so quantum computing breaks free from the constraints of traditional binary code. These qubits can be entangled into multi-qubit states, enabling powerful quantum computation. Today’s digital computers struggle with calculations that involve finding the optimal arrangements of items, because they must work through each permutation to find the best. This can take an enormous number of calculations and, the more complicated the code, the more processing power required, and the longer the processing takes. A quantum computer, on the other hand, uses the multiplicity of states in the quantum world to work through possibilities simultaneously and at much greater speed. The real-life applications this opens up will benefit a wide range of industries in a multitude of ways, some of which we’re yet to fathom. However, the industry uses are likely to focus on two key benefits: - the ability to handle complexity and vast amounts of data - the sheer speed of calculations, scenario assessment and pattern recognition. Quantum computing is adept at handling complexity Perhaps the most astounding benefit of quantum computing is its capacity to process complex problems beyond the capabilities of today’s digital computers – even down to the molecular level. Chemical and biological engineering researchers are excited about the possibilities of discovering, mapping and manipulating molecules, using quantum computing to understand the motion and interaction of subatomic particles. This could open the door to being able to create a room-temperature superconductor, removing carbon dioxide for a better climate, and creating solid-state batteries. And over in pharmacology, researchers will be able to model and simulate interactions between drugs and all 20,000+ proteins encoded in the human genome, pushing knowledge forward and accelerating current efforts in materials discovery and drug development. Quantum computing will provide an effective way of understanding drugs and their reactions on humans, making more drug options available. Artificial intelligence and machine learning will also benefit from quantum developments that can process complex problems and recognise patterns in less time than conventional computers. As a result, diagnostics in healthcare will improve and healthcare professionals will be able to optimise targeted treatments such as radiotherapy by modelling complex scenarios to identify the optimum treatment path. Fraud detection in finance, too, is reliant upon pattern recognition, and quantum computing can potentially improve detection rates. However, quantum computing’s ability to process complex calculations will also force industries to change how they do things, most notably so far in the arena of cybersecurity. Quantum computing has the potential to be able to crack the mathematics that underpins much of the current cryptography that’s used to secure networks. The type of algorithms that are most affected are ‘asymmetric’ algorithms used in key exchange, digital signatures and Public Key Infrastructure certificate-based authentication. But, as well as a threat, quantum computing can also be the solution. Post-Quantum Cryptography will use quantum computing to keep all the functionality of existing cryptography, at the same time as upgrading it to be much harder for quantum computers to break. Quantum computing unlocks record calculation speeds For other industries, the sheer speed of quantum computing is the leading draw. Computing speed has long been a source of advantage in financial markets where hedge funds compete to achieve millisecond advantages in obtaining price information. Quantum computing means the faster calculation of massive, complex scenarios to find the right mix for fruitful investments based on expected returns. Plus, quantum algorithms can increase the speed of market variables analysis, automatically triggering high-frequency trading. Calculation speed is also important in areas such as weather forecasting. Currently, the process of analysing weather conditions by traditional computers can sometimes take longer than the weather itself does to change and, at best, limits forecasters’ ability to warn of weather events. Quantum computing will be able to crunch huge amounts of data at speed, enhancing weather modelling by improving pattern recognition to make prediction more accurate, increasing the amount of warning that can be given. It’ll also be able to generate greater insight into climate change and mitigation. Sifting through possibilities at speed is vital in sectors such as manufacturing and industrial design, too. Developing a working product traditionally takes draft after draft and test after test. However, quantum computing can identify the most effective option rapidly, delivering better designs for a better product. Rapid modelling also has significant, positive implications for logistics and supply chain efficiency. It’ll unlock real-time optimisation, allowing continuous calculation of optimal routes of traffic management, fleet operations, air traffic control and freight and distribution. Welcome to a quantum-powered world The possibilities of quantum computing are, perhaps, beyond full comprehension right now, but there’s clear potential in every industry. In fact, our early trial with the consultancy company, EY, is already yielding some promising results. For more insight into how we’re harnessing quantum computing in our solutions, download our whitepaper.
<urn:uuid:382c9e44-0bec-47d6-ae7c-6dfa14a039ea>
CC-MAIN-2023-06
https://www.globalservices.bt.com/en/insights/blogs/are-you-ready-for-a-future-based-on-quantum-computing
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499634.11/warc/CC-MAIN-20230128121809-20230128151809-00359.warc.gz
en
0.920663
1,102
3.59375
4
Whenever you take a closer look at quantum computers and how they work, it’s almost surreal. Not many of us are aware of the massive leap these supercomputers represent in processing power. As mysterious as quantum physics has become to the average person, the same can be said about quantum computers. Their future trajectory will easily surpass today’s most powerful computers. The role of quantum computers Bear in mind that quantum computers aren’t here to replace conventional computers. There is no way that could happen because traditional computers solve too many everyday problems. And they’re also too economical and too easy to use. Instead, quantum computers will be reserved to address the cutting-edge projects in virtually every field – ranging from social issues to engineering methods to pharmaceuticals research. Many companies are already designing and experimenting with research projects that are ideally suited for the newest supercomputer. A prevailing problem for our society, in general, has been our inability to process the mountains of data we generate. We need the processing power to analyze data in a timelier manner. There is where quantum computing methods will earn their keep. The real secret behind the power of a quantum computer is how it generates and processes quantum bits, also known as qubits. What is a qubit? We are familiar with the term ‘bit’ because we define bits of information generated in today’s conventional computers. A bit is simply a stream of electrical pulses that represent either a 1 or 0. All things created by current computers are a long string of bits. Rather than bits, a quantum computer uses qubits, which are comprised of subatomic particles like photons and electrons. As you might imagine, creating and processing qubits is both an engineering and scientific challenge. Some companies choose to use superconductors that can be cooled to temperatures that are colder than deep space. Other companies approach the process by trapping individual atoms in electromagnetic fields that exist on silicon chips within ultra-high-vacuum chambers. Both approaches have the same goals of isolating qubits within a controlled quantum state. Qubits possess some rather outrageous quantum properties that allow them to generate far more computing power than the same number of ordinary binary bits. One such property is called superposition, and another known as entanglement. What is superposition? Qubits are capable of representing countless potential combinations of 1 and 0 simultaneously. This unique ability to represent multiple states is superposition. To lift qubits into superposition, scientists manipulate them with microwave beams and precision lasers. Because of this counterintuitive property, quantum computers with many qubits in superposition can blast through massive numbers of possible outcomes simultaneously. The result of a calculation is rendered only when the qubits have been measured. After this, they collapse out of their quantum state into either a 1 or 0. What is entanglement? Entanglement is where pairs of qubits exist in one quantum state. Thus, whenever the state of one qubit is changed, the same change is instantly applied to the other entangled qubit. Researchers have discovered that this will happen in a predictable way – even when extremely long distances separate them. As to how and why entangled qubits work like this remains a mystery. However, it’s one of the critical reasons that quantum computers are so powerful. In conventional computers, when you double the number of bits, you double its overall processing power. But because of entanglement, the addition of qubits to a quantum computer creates an exponential increase in its ability to process data and information. Imagining the plethora of quantum algorithms that could be designed and executed at this quantum level explains all the excitement about them. But it’s not all good news. One problem with them is they are far more prone to errors than conventional processors. The reason for this is something called ‘decoherence.’ What is decoherence? Decoherence is when the quantum behavior of qubits decays and disappears. Because their quantum state is so fragile, the slightest change in temperature or a tiny vibration – known as outside ‘noise’ – can cause them to fall out of superposition before it has finished its assigned task. Because of their fragile nature, we’ve seen scientists attempt to protect qubits by putting them in vacuum chambers and supercooled environments. Despite these precautions, noise still manages to induce errors in calculations. Quantum algorithms that are cleverly designed can compensate for some of these errors, and the addition of more qubits also seems to help. But as it stands, thousands of standard qubits are required to generate a single, reliable one – and these are called ‘logical’ qubits. Thus, just to get an adequate number of logical qubits, a quantum computer must devote lots of its computational capacity. This brings us to the quantum brick wall that scientists are facing. Thus far, researchers have not been able to create more than 128 standard qubits. So we’re quite a few years away from developing that quantum computer that’ll be useful. Fortunately, this quantum brick wall hasn’t slowed down the efforts of computer researchers who seek to find an answer. What is quantum supremacy? Quantum supremacy is the ultimate goal of researchers. It represents that point where quantum computers can perform mathematical calculations that are way beyond the capabilities of the most powerful supercomputers. And to do so reliably. Currently, no one yet knows how many qubits would be required to achieve such a goal. One reason is that the goalposts keep moving. Other researchers continue to find new ways and algorithms that boost the power of classical computers, and the hardware of supercomputers keeps improving as well. Nonetheless, quantum computer researchers and their sponsors continue working diligently to reach quantum supremacy as they compete against the most powerful supercomputers on the planet. And the research world at large hasn’t given up the cause either. Many companies continue experimenting with them now – instead of waiting for supremacy. Several firms have even allowed outside access to their quantum machines. What will the first quantum computer be used for? There are many promising applications of quantum computers. One such approach is the simulation of matter at the molecular level. Automakers are experimenting with quantum computers to simulate the chemical composition of things like electrical-vehicle batteries to enhance performance. Major pharmaceutical firms use them to compare and analyze compounds that may find new drugs. They can accomplish what previously took humans years to accomplish using conventional methods in just a few days. These quantum machines are phenomenal at analyzing numbers and solving optimization problems incredibly fast. This ability alone makes them extremely valuable in a broad spectrum of disciplines. Even though it could take several years for quantum computers to reach their full potential, it appears to be well worth the effort. One thing that is working against them is that businesses and universities experience a growing shortage of researchers with the required skills. Efforts to bolster STEM candidates are falling woefully short of expectations. Secondly, there are not enough suppliers of the required vital components to support quantum computers. And many companies have shifted their priorities elsewhere at present. Hopefully, we will be able to overcome these obstacles and reach our quantum goals. These new computing machines could completely revolutionize entire industries and boost global innovation to levels never seen before.
<urn:uuid:645f64d7-e39f-4b8b-b3c5-0c37bddf89ca>
CC-MAIN-2023-06
https://themindguild.com/how-quantum-computers-work-why-theyre-powerful/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499953.47/warc/CC-MAIN-20230201211725-20230202001725-00198.warc.gz
en
0.940053
1,516
3.96875
4
Those of us who grew up learning about the double slit experiment, know that the Copenhagen interpretation, very simplified, dictates that an electron can be said to behave as both a particle and a wave…but that we mustn’t care. The interference patterns generated on the detector screen supposedly force us to accept that an electron physically moves through the two slits in the apparatus at once. Statistically, it must, and it is only by measuring that we can find out if we are dealing with a particle or wave behavior. What happens to the electron before the detector screen measures a wave or particle behavior is undetermined. It is a question that cannot be asked (under the Copenhagen view) because we can’t find out. Even stranger, the fact that a detector is or is not observing this experiment, changes the outcome of the experiment at the quantum level. The act of observing is said to alter the experiment. Usually, this is when the teachers stops explaining (and sweating). This is all very unsatisfying. Students want to understand what happens at the slits. Does the particle split in two? Does it change into a wave and back? And what is up with that observer? Why is there only a statistical and not an intuitive explanation? For most people, this is an incredibly difficult concept to grasp. Most students -and teachers- never get beyond this point. It usually takes an advanced course in physics to find out about the underlying principles that allow these experimental results to manifest. You need to be introduced to the idea that everything in nature is a manifestation of some field. Both particles and waves are constituted ‘only’ of fields in various forms. Furthermore, a detector can only “detect” because it generates a ‘force field’ that a particle or wave to be detected can bounce into and interact with. Once a student learns this, it is immediately understood that the particle-wave behavior is just another way the same fields manifest and interact. It again becomes intuitive. (They are usually also very angry not learning about fields immediately.) The ‘observer effect’ is just an extra field, that of the detector, you need to account for in your experimental setup. If you don’t understand this. Don’t worry. Just think about the invisible force field you notice when you play with magnets. That is one of the fields ‘stuff’ is made off. Other types of fields in nature manifest in different ways but they too are detectable. (All of them are part of the standard model of physics. Google it. Find out.) Vice versa. The double slit experimental setup does have a practical use. It can be used as a detector. It detects changes in the surrounding fields. An interesting question is if this so-called ‘observer’, or better ‘detector field’ effect, can also be influenced by people. Our brains and heart muscles create detectable small currents and electric fields, and electric fields interact, so our presence or nearness to the electrical field of a detector must influence what the double slit experiment measures. Correct? This has been tested over and again over the last decades in hundreds of experiments and…apparently it does. An interesting result. Even stranger, it works at large distances. A person doesn’t necessarily need to be in the same room… or even country as the experiment setup. You just need to focus your brain on the little box running the double slit experiment. Euh. What? Wait a minute… Electric fields taper off with distance. Does this mean that the electric fields in our brain our doing something…like quantum entanglement at a distance? Does this mean that the mind interacts with matter… at a distance? Well. On the one hand, the answer is: of course it does. Our mind is just the manifestation of electrical pulses racing around and interacting in the neural network of our brain. Depending on whether you believe our brain is just a glorified ‘wet’ computer, this view suffices to most material reductionists, but religion, philosophical traditions and human experiences have always informed us that science doesn’t really get the human experience. Reductionism, although it does bring many technological advances, seems to overshoot in many cases. Our intuition and mind seem to be able of a lot more than science gives it credit for. Science seems to throw out the baby with the bath water, and even open-minded investigators feel their credibility will suffer when they tackle these subjects, and in doing so damage their career prospects. But what if the reach of our electric fields, and maybe our consciousness reaches much further? Is there something as a consciousness that extends beyond our body? Couldn’t we benefit from finding out? Are we now in a position to ask these questions beyond mere philosophical discourse? Well. We know of at least person who is not afraid to ask. If you are interested in how precisely this question can be asked and tested in a scientific and empirical manner, I recommend the lecture below, given by Dean Radin of the Institute of Noetic Sciences (IONS). He shows there is a way and the results are very intriguing to say the least. For other people who prefer to have a wet computer for a brain and do not believe in consciousness interacting at a distance: Noticing the progress around the world with brain steered prostheses, implanting that wifi-router is less than a decade away. Select your provider wisely.
<urn:uuid:f276bc4d-9e19-4677-b3b6-c0d69b496858>
CC-MAIN-2023-06
https://onestagetospace.com/2017/10/06/can-a-consciousness-field-be-tested-empirically-yes-just-be-sure-to-put-the-detector-at-the-other-end-of-the-earth-and-think-inside-of-the-box/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500357.3/warc/CC-MAIN-20230206181343-20230206211343-00519.warc.gz
en
0.943036
1,130
3.609375
4
A quantum computer doesn’t need to be a single large device but could be built from a network of small parts, new research from the University of Bristol has demonstrated. As a result, building such a computer would be easier to achieve. Many groups of research scientists around the world are trying to build a quantum computer to run algorithms that take advantage of the strange effects of quantum mechanics such as entanglement and superposition. A quantum computer could solve problems in chemistry by simulating many body quantum systems, or break modern cryptographic schemes by quickly factorising large numbers. Previous research shows that if a quantum algorithm is to offer an exponential speed-up over classical computing, there must be a large entangled state at some point in the computation and it was widely believed that this translates into requiring a single large device. In a paper published in the Proceedings of the Royal Society A, Dr Steve Brierley of Bristol’s School of Mathematics and colleagues show that, in fact, this is not the case. A network of small quantum computers can implement any quantum algorithm with a small overhead. The key breakthrough was learning how to efficiently move quantum data between the many sites without causing a collision or destroying the delicate superposition needed in the computation. This allows the different sites to communicate with each other during the computation in much the same way a parallel classical computer would do. We provide algorithms for efficiently moving and addressing quantum memory in parallel. These imply that the standard circuit model can be simulated with low overhead by the more realistic model of a distributed quantum computer. As a result, the circuit model can be used by algorithm designers without worrying whether the underlying architecture supports the connectivity of the circuit. In addition, we apply our results to existing memory intensive quantum algorithms. We present a parallel quantum search algorithm and improve the time-space trade-off for the Element Distinctness and Collision Finding problems. In classical parallel computing, sorting networks provide an elegant solution to the routing problem and simulation of the parallel RAM model. In this paper, we have demonstrated that they can be applied to quantum computing too. The information about the connectivity of a quantum circuit is available before we run the algorithm (at compile time). Using this classical information we have designed an efficient scheme for routing quantum packets. The application of this data-moving algorithm is to distributed quantum computing. We provide an efficient way of mapping arbitrary unconstrained circuits to limited circuits respecting the locality of a graph. Our results already apply to nearest neighbour architectures in the case of a circuit that is highly parallel. The case of emulating a circuit with many concurrent operations on a 1D nearest neighbour machine was covered by Hirata et al. The approach is to use the Insertion/Bubble sort to perform all of the operations in O(N) time-steps which compares favorably to performing each gate in turn in O(N2) depth. We put this idea in a general framework applying to any (connected) graph. Along the way we are able to prove that up to polylogarithmic factors, this approach is optimal. We have shown how the addition of a few long-range (or flying) qubits dramatically increases the power of a distributed quantum computer. Using only O(logN) connections per node enables efficient sorting over the hypercube. A distributed quantum computer with nodes connected according to the hypercube graph would be able to emulate arbitrary quantum circuits with only O(log2 N) overhead. One might expect that a quantum computer requires O(N) connections per node so that each qubit can potentially interact with any other qubit. Our result demonstrates that this is not the case: for a small overhead O(logN) connections suffice. We have presented a new algorithm for accessing quantum memory in parallel. The algorithm is a modification of the data-moving algorithm used in Sections 2 and 3 but where the destinations are quantum data and no longer restricted to form a permutation. The algorithm is extremely efficient; it has an overhead that is scarcely larger than any algorithm capable of accessing even a single entry from memory. Theorem 5 implies that N processors can have unrestricted access to a shared quantum memory. It tells us that the quantum parallel RAM and the circuit models are equivalent up to logarithmic factors. Finally, we demonstrated that the parallel look-up algorithm can be used to optimize existing quantum algorithms. We provided an extension of Grover’s algorithm that efficiently performs multiple simultaneous searches over a physical database, and answered an open problem posed by Grover and Rudolph by demonstrating an improved spacetime trade-off for the Element Distinctness problem. It seems likely that this framework for efficient communication in parallel quantum computing will be a useful subroutine in other memory-intensive quantum algorithms, such as triangle finding, or more generally for frameworks such as learning graphs. Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technology and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology. Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels. A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.
<urn:uuid:a5a8eba5-0336-4d8c-a165-a95add1ba154>
CC-MAIN-2023-06
https://www.nextbigfuture.com/2013/02/quantum-hypercube-memory-will-enable.html
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500628.77/warc/CC-MAIN-20230207170138-20230207200138-00360.warc.gz
en
0.924567
1,191
3.921875
4
While the scientific community holds its breath for a large-scale quantum computer that could carry out useful calculations, a team of IBM researchers has approached the problem with an entirely different vision: to achieve more and better results right now, even with the limited quantum resources that exist today. By tweaking their method, the scientists successfully simulated some molecules with a higher degree of accuracy than before, with no need for more qubits. The researchers effectively managed to pack more information into the mathematical functions that were used to carry out the simulation, meaning that the outcome of the process was far more precise, and yet came at no extra computational cost. "We demonstrate that the properties for paradigmatic molecules such as hydrogen fluoride (HF) can be calculated with a higher degree of accuracy on today's small quantum computers," said the researchers, at the same time priding themselves on helping quantum computers "punch above their weight". SEE: Hiring Kit: Computer Hardware Engineer (TechRepublic Premium) Car manufacturer Daimler, a long-term quantum research partner of IBM's, has shown a strong interest in the results, which could go a long way in developing higher-performing, longer-lasting and less expensive batteries. Since 2015, Daimler has been working on upgrading lithium-ion batteries to lithium-sulfur ones – a non-toxic and easily available material that would increase the capacity and speed-of-charging of electric vehicles. Designing a battery based on new materials requires an exact understanding of which compounds should come together and how. The process involves accurately describing all the characteristics of all the molecules that make up the compound, as well as the particles that make up these molecules, to simulate how the compound will react in many different environments. In other words, it is an incredibly data-heavy job, with infinite molecular combinations to test before the right one is found. The classical methods that exist today fail to render these simulations with the precision that is required for a breakthrough such as the one Daimler is working towards. "This is a big problem to develop next-generation batteries," Heike Riel, IBM Research quantum lead, told ZDNet. "Classical computers, and the models we've developed in physics and chemistry for many years still cannot solve those problems." But the task could be performed at speed by quantum computers. Qubits, and their ability to encode different information at the same time, enable quantum algorithms to run several calculations at once – and are expected, one day, to enable quantum computers to tackle problems that are seemingly impossible, in a matter of minutes. To do that, physicists need quantum computers that support many qubits; but scaling qubits is no piece of cake. Most quantum computers, including IBM's, work with less than 100 qubits, which is nowhere near enough to simulate the complex molecules that are needed for breakthroughs, such as lithium-sulfur car batteries. Some of the properties of these molecules are typically represented in computer experiments with a mathematical function called a Hamiltonian, which represents particles' spatial functions, also called orbitals. In other words, the larger the molecule, the larger the orbital, and the more qubits and quantum operations will be needed. "We currently can't represent enough orbitals in our simulations on quantum hardware to correlate the electrons found in complex molecules in the real world," said IBM's team. Instead of waiting for a larger quantum computer that could take in weighty calculations, the researchers decided to see what they could do with the technology as it stands. To compensate for resource limitations, the team created a so-called "transcorrelated" Hamiltonian – one that was transformed to contain additional information about the behavior of electrons in a particular molecule. This information, which concerns the propensity of negatively charged electrons to repel each other, cannot usually fit on existing quantum computers, because it requires too much extra computation. By incorporating the behavior of electrons directly into a Hamiltonian, the researchers, therefore, increased the accuracy of the simulation, yet didn't create the need for more qubits. The method is a new step towards calculating materials' properties with accuracy on a quantum computer, despite the limited resources available to date. "The more orbitals you can simulate, the closer you can get to reproducing the results of an actual experiment," said the scientists. "Better modelling and simulations will ultimately result in the prediction of new materials with specific properties of interest." SEE: Quantum computers are coming. Get ready for them to change everything IBM's findings might accelerate the timeline of events for quantum applications, therefore, with new use cases emerging even while quantum computers work with few qubits. According to the researchers, companies like Daimler are already keen to find out more about the breakthrough. This is unlikely to shift IBM's focus on expanding the scale of its quantum computer. The company recently unveiled a roadmap to a million-qubit system, and said that it expects a fault-tolerant quantum computer to be an achievable goal for the next ten years. According to Riel, quantum simulation is likely to be one of the first applications of the technology to witness real-world impacts. "The car batteries are a good example of this," she said. "Soon, the number of qubits will be enough to generate valuable insights with which you can develop new materials. We'll see quantum advantage soon in the area of quantum simulation and new materials." IBM's roadmap announces that the company will reach 1,000 qubits in 2023, which could mark the start of early value creation in pharmaceuticals and chemicals, thanks to the simulation of small molecules.
<urn:uuid:a36fa325-2e6d-44bc-93e8-1b7ac14cc8fa>
CC-MAIN-2023-06
https://www.zdnet.com/article/less-is-more-ibm-achieves-quantum-computing-simulation-for-new-materials-with-fewer-qubits/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499829.29/warc/CC-MAIN-20230130201044-20230130231044-00401.warc.gz
en
0.95185
1,155
3.65625
4
The underlying principles of modern cryptography rely on fascinating mathematical problems which removes the dependency of covert key sharing between parties to communicate with utmost privacy even in the presence of an adversary. To formulate such a system, Public Key Cryptography/Asymmetric Cryptography was released which used a pair of keys (pk, sk) where pk denotes the public key and (sk) is the secret key. The Public Key is available to all the users but the private key is kept hidden. Both the keys are intertwined through a mathematical function such that encryption can be performed with the public key that can only be decrypted with the private key and vice versa. The relationship between the public key and the private key is through a special kind of one-way functions called Trapdoor functions. What are One way Functions? One way functions possess this hardness property such that it is easy to compute the one-way function but hard to invert. ”Easy” corresponds to the fact that the function can be computed efficiently and ”hard” means that any algorithm attempting to invert it will succeed with a very small probability. Trapdoor functions are one-way functions with additional trapdoor information which allows the inverse to be easily computed. Secure Public-Key Crypto systems are built using a one-way function that has a trapdoor. Consider a public-key crypto system with a pair of keys generated by a key generation algorithm as (pk, sk). If Alice wants to send a message to Bob she uses the public key of Bob to encrypt the message. On other hand, Bob uses his private key pk to decrypt the message. Here pk is the trapdoor information with which the legitimate user Bob could decrypt the message in polynomial time. The candidates of one-way functions and trapdoor functions used in modern cryptography are derived from number theory. Some of the examples of such functions are * Discrete Logarithm Problem, * Factorization and RSA Discrete Logarithm Problem Discrete Logarithm Problem is the construction behind the very first public-key cryptosystem primitive ”Diffie Hellman Key Exchange”. It is based on finding x in the following equation where G is an element in a special algebraic structure called Fields where any power of x of G generates an element in that field. Factorization and RSA Imagine factoring 12 into its prime parts. Within a matter of seconds, one can properly answer 3 and 12. Again for a healthy brain exercise try to calculate the factorization of 128 -- may require some piece of paper, after computing that goes on to a number like 14567978. Here is where the problem starts. Without a calculator or a program, it may take more than a few minutes to solve it. Similarly what if I say factorize a 1024 bits number? Now the situation is out of hand and it seems impossible without a method to solve it. RSA is based on this difficulty of factoring a large number. Of course with a trapdoor as a secret key which is the inverse of the number. Can TrapDoors Expire? Trapdoor functions are the basis of modern cryptography and in turn the heart of cryptographic applications like Blockchains. These functions have single-handedly upheld the security of such networks with goals ranging from encryption to identity management and authenticated transfers. These paradigms have been securing our data since its inception by Diffie-Hellman in the 1990s by iterating to better possibilities and to be universally applicable. But a series of twists have altered the thoughts from cryptographic agility to mere doubts on its security. In 1996, Peter Shor proposed a quantum algorithm that was able to solve the sorcery behind modern cryptography within a feasible time. More precisely, to factor a number N, it takes a time complexity of O(log N). In layman terms, it means an attacker can hijack your account within two hours. So, How are we going to secure our credit card and other sensitive information while purchasing online? What about cryptocurrencies that at the core rely on these paradigms? The answer to the above question lies in the concept of Post-Quantum cryptography which promises to secure our infrastructure from the quantum apocalypse. Archethic (public blockchain by Uniris) realises this threat posed by quantum cryptography and has progressed in parallel to research in the field of cryptography by adding backward compatibility and giving choice to the users for the algorithms. The 2nd article in this series will cover in detail more about Post-quantum cryptography and how it helps in Quantum-Resistant Blockchains. Archethic Public Blockchain Archethic is a Layer 1 aiming to create a new Decentralized Internet. Its blockchain infrastructure is the most scalable, secure & energy-efficient solution on the market thanks to the implementation of a new consensus: "ARCH". Archethic smart contracts expand developers' boundaries by introducing internal oracle, time-triggers, editable content & interpreted language. Through native integration for DeFi, NFTs & decentralized identity; Archethic offers an inclusive and interoperable ecosystem for all blockchains. In order to achieve the long-term vision of an autonomous network in the hands of the world population, we developed a biometric device respecting personal data privacy (GDPR compliant). Making the blockchain world accessible with the tip of a finger. Be the only key! https://www.archethic.net/ Archethic Foundation Non-profit in order to manage decentralized governance of the public blockchain
<urn:uuid:afcec769-b637-4ab1-9d59-7d452b22a798>
CC-MAIN-2023-06
https://blog.archethic.net/trapdoors-and-cryptography/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499826.71/warc/CC-MAIN-20230130165437-20230130195437-00281.warc.gz
en
0.912712
1,162
4.03125
4
The roots of quantum computing come from quantum mechanics, which deals with atomic and subatomic level energy interactions. Quantum mechanics can explain unsolved theoretical physics, chemistry, and technology problems mathematically. Thus, it is considered the steppingstone for the unified theory of the Universe. The major challenge when dealing with quantum mechanics is its inability to simulate the quantum models. A simple quantum system with few interactions requires enormous computing power to simulate. Richard Feynman and Yuri Manin studied this knowledge gap, and they postulated a new genre of computing known as quantum computing. Their introduction to quantum computing relies on the Controlled NOT (CNOT) gate, which can simulate any arbitrary quantum circuit. What is Quantum Computing? When the problem is very complex, and the system needs to process massive data, quantum computers can show the magic. The prominent feature of quantum computing is leveraging the uncertainties, whereas classical computing cannot accept uncertainties. Conventional computing relies on bits with two states, 1 and 0. Quantum computing leverages the Qubits (Quantum bits), taking the states 0, 1, and quantum superpositions of 0 and 1. Arguably, the increased states can increase the computational delay, thus reducing the processing speed. Considering a single operation, this argument is valid, and it may take more time. But the number of processes required for solving a computationally intricate problem can be significantly reduced. That’s how quantum computers can successfully simulate subatomic particle interactions. Apart from Qubits, many other quantum principles such as superposition, quantum entanglement, interference, and coherence are also utilized in quantum architecture. What Change Can Quantum Computing Bring? Even though Quantum computing is in the infant stage, as the technology evolves, it can produce significant advancements in machine learning, material science, Theoretical physics, nuclear physics, chemistry, energy, medicine design, etc. One of the significant threats that scientists are expecting is information security. When quantum computers are in full swing, our existing cryptographic techniques built on the complexity of the mathematical problem will become a cakewalk for quantum computers. This scenario demands a complete restructuring of our information security architecture and existing cryptocurrency architecture. How to Get Your Hands on Quantum Computing? Many cloud-based quantum computing solutions are available for users. The major players are IBM quantum, Google Quantum Computing Service (Cirq), Amazon Bracket, and Microsoft Azure Quantum. Azure quantum is the only service that is not limited to single quantum hardware for implementing your quantum logic. Azure quantum computing service has been in its public preview since early 2021. Why Azure Quantum? Azure quantum computing can present you Quantum Development kit, the best development environment out of the whole spectra of quantum services. The flexibility and adaptability of multiple quantum hardware usages are significant advantages. Quantinum, IONQ, Pasqal, Rigetti, And Quantum circuits are the major quantum service providers in Azure Quantum. In Azure quantum, You can work with popular quantum programming frameworks such as Cirq, Qiskit, and Q#. It provides Quantum Inspired Optimization (QIO), apart from quantum computing service. The major QIO service providers in Azure Quantum are IQbit, Microsoft QIO, and Toshiba SBM. What is Quantum Development Kit? Microsoft Quantum Development Kit (QDK) is the open-source environment for your Microsoft Azure Quantum. You can use python-based SDKs like Qiskit or Cirq. Otherwise, users can leverage the high-level quantum programing language Q#. The resource estimator facility will help you forecast the cost of running your code. How to Develop a Quantum Application Using Azure Quantum? As a baby step, let us consider the creation of random binary bits 0 and 1 in a qubit place holder. Step 1: – Create Azure Quantum workspace. Open Quantum workspace from Azure portal For free usage, you can select the Quick create option. Advanced create contains a few paid hardware services. Azure quantum pricing is significantly less than other quantum computing services in the cloud computing environment. Populate the required detail fields Click on create and wait for deployment Step 2: – Create Jupyter notebook in Azure quantum workspace. Select Notebooks from the Operations section of the left blade. You can select the sample jobs from the sample gallery. For a new notebook, click on the three dots near my notebooks Select new notebook You can select Kernel type, either IPython or IQ#, and provide a file name. Step 3: – Import Workspace Here the coding is based on Python and Q#. The new notebook will contain the code to connect to your quantum workspace by default. You can use connect function to connect to the workspace. You can see the available target instances using the code given below and verify that the required hardware is available. Step 4: – Implement your quantum logic The next step is to build your business logic using quantum programming languages such as Q#, Cirq, and Qiskit. The given example generates a random bit using Q#. We utilized the Measurement, Arrays, and Convert modules for random qubit generation. We submit the job to IonQ hardware, a general-purpose trapped ion quantum computer that executes the function. Before selecting the target, make sure the in the available target list ionq simulator is available. Wait until the job is succeeded. For a better understanding of the result, we can plot the probability of 1 and 0 bits using matplotlib. The process is similar in all the languages like Qiskit and Cirq. MS Learn: https://docs.microsoft.com/en-us/learn/paths/quantum-computing-fundamentals/ CloudThat pioneered cloud training and cloud consulting space in India since 2012. The Cloud arena has identified us as a Cloud-Agnostic organization providing cloud consulting for all major public cloud providers like AWS, Azure, GCP, and others. We provide all-encompassing cloud consulting services that comprise Cloud Consulting & Migration Services, Cloud Media Services, Cloud DevOps & DevSecOps, Cloud Contract Engineering, and Cloud Managed Services. We have a proud clientele that comprises the top 100 fortune 500 companies. Moreover, we have carved a niche in the cloud space by being partnered with all major cloud providers. We are a Microsoft Gold Partner, Advanced AWS Consulting Partner, AWS Authorized Training Partner, Authorized Google Training Partner, and VMware Training Reseller.
<urn:uuid:9e89773c-d17c-4596-8444-d55af1b3a917>
CC-MAIN-2023-06
https://www.cloudthat.com/resources/blog/quantum-computing-as-a-service-qcaas-azure-quantum-taking-baby-steps-towards-quantum-computing
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500983.76/warc/CC-MAIN-20230208222635-20230209012635-00441.warc.gz
en
0.870363
1,339
3.703125
4
Sometimes, it’s easy for a computer to predict the future. Simple phenomena, such as how sap flows down a tree trunk, are straightforward and can be captured in a few lines of code using what mathematicians call linear differential equations. But in nonlinear systems, interactions can affect themselves: When air streams past a jet’s wings, the air flow alters molecular interactions, which alter the air flow, and so on. This feedback loop breeds chaos, where small changes in initial conditions lead to wildly different behavior later, making predictions nearly impossible — no matter how powerful the computer. “This is part of why it’s difficult to predict the weather or understand complicated fluid flow,” said Andrew Childs, a quantum information researcher at the University of Maryland. “There are hard computational problems that you could solve, if you could [figure out] these nonlinear dynamics.” That may soon be possible. In separate studies posted in November, two teams — one led by Childs, the other based at the Massachusetts Institute of Technology — described powerful tools that would allow quantum computers to better model nonlinear dynamics. Quantum computers take advantage of quantum phenomena to perform certain calculations more efficiently than their classical counterparts. Thanks to these abilities, they can already topple complex linear differential equations exponentially faster than classical machines. Researchers have long hoped they could similarly tame nonlinear problems with clever quantum algorithms. The new approaches disguise that nonlinearity as a more digestible set of linear approximations, though their exact methods vary considerably. As a result, researchers now have two separate ways of approaching nonlinear problems with quantum computers. “What is interesting about these two papers is that they found a regime where, given some assumptions, they have an algorithm that is efficient,” said Mária Kieferová, a quantum computing researcher at the University of Technology Sydney who is not affiliated with either study. “This is really exciting, and [both studies] use really nice techniques.” The Cost of Chaos Quantum information researchers have tried to use linear equations as a key to unlock nonlinear differential ones for over a decade. One breakthrough came in 2010, when Dominic Berry, now at Macquarie University in Sydney, built the first algorithm for solving linear differential equations exponentially faster on quantum, rather than on classical, computers. Soon, Berry’s own focus shifted to nonlinear differential equations as well. “We had done some work on that before,” Berry said. “But it was very, very inefficient.” The problem is, the physics underlying quantum computers is itself fundamentally linear. “It’s like teaching a car to fly,” said Bobak Kiani, a co-author of the MIT study. So the trick is finding a way to mathematically convert a nonlinear system into a linear one. “We want to have some linear system because that’s what our toolbox has in it,” Childs said. The groups did this in two different ways. Childs’ team used Carleman linearization, an out-of-fashion mathematical technique from the 1930s, to transform nonlinear problems into an array of linear equations. Unfortunately, that list of equations is infinite. Researchers have to figure where they can cut off the list to get a good-enough approximation. “Do I stop at equation number 10? Number 20?” said Nuno Loureiro, a plasma physicist at MIT and a co-author of the Maryland study. The team proved that for a particular range of nonlinearity, their method could truncate that infinite list and solve the equations. The MIT-led paper took a different approach. It modeled any nonlinear problem as a Bose-Einstein condensate. This is a state of matter where interactions within an ultracold group of particles cause each individual particle to behave identically. Since the particles are all interconnected, each particle’s behavior influences the rest, feeding back to that particle in a loop characteristic of nonlinearity. The MIT algorithm mimics this nonlinear phenomenon on a quantum computer, using Bose-Einstein math to connect nonlinearity and linearity. So by imagining a pseudo Bose-Einstein condensate tailor made for each nonlinear problem, this algorithm deduces a useful linear approximation. “Give me your favorite nonlinear differential equation, then I’ll build you a Bose-Einstein condensate that will simulate it,” said Tobias Osborne, a quantum information scientist at Leibniz University Hannover who was not involved in either study. “This is an idea I really loved.” Berry thinks both papers are important in different ways (he wasn’t involved with either). “But ultimately the importance of them is showing that it’s possible to take advantage of [these methods] to get the nonlinear behavior,” he said. Knowing One’s Limits While these are significant steps, they are still among the first in cracking nonlinear systems. More researchers will likely analyze and refine each method — even before the hardware needed to implement them becomes a reality. “With both of these algorithms, we are really looking in the future,” Kieferová said. Using them to solve practical nonlinear problems requires quantum computers with thousands of qubits to minimize error and noise — far beyond what’s possible today. And both algorithms can realistically handle only mildly nonlinear problems. The Maryland study quantifies exactly how much nonlinearity it can handle with a new parameter, R, which represents the ratio of a problem’s nonlinearity to its linearity — its tendency toward chaos versus the friction keeping the system on the rails. “[Childs’ study is] mathematically rigorous. He gives very clear statements of when it will work and when it won’t work,” Osborne said. “I think that’s really, really interesting. That’s the core contribution.” The MIT-led study doesn’t rigorously prove any theorems to bound its algorithm, according to Kiani. But the team plans to learn more about the algorithm’s limitations by running small-scale tests on a quantum computer before moving to more challenging problems. The most significant caveat for both techniques is that quantum solutions fundamentally differ from classical ones. Quantum states correspond to probabilities rather than to absolute values, so instead of visualizing air flow around every segment of a jet’s fuselage, for example, you extract average velocities or detect pockets of stagnant air. “This fact that the output is quantum mechanical means that you still have to do a lot of stuff afterwards to analyze that state,” Kiani said. It’s vital to not overpromise what quantum computers can do, Osborne said. But researchers are bound to test many successful quantum algorithms like these on practical problems in the next five to 10 years. “We’re going to try all kinds of things,” he said. “And if we think about the limitations, that might limit our creativity.”
<urn:uuid:0e1dbff7-36d2-4a38-90f6-46ad91f3233d>
CC-MAIN-2023-06
https://www.quantamagazine.org/new-quantum-algorithms-finally-crack-nonlinear-equations-20210105
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500017.27/warc/CC-MAIN-20230202101933-20230202131933-00683.warc.gz
en
0.932572
1,503
3.75
4
Coffee cups and donuts are one and the same, according to mathematicians. While they might have some serious physical differences (crunching on a coffee cup would be significantly less pleasant than biting into a donut,) their underlying topology, or the way their surfaces bend, are the same because they both have only one hole. These similarities may seem semantic, but making use of topology has enabled scientists to explore futuristic materials and their uses in high-performance technologies and quantum computing. But, a golden-child of topological technology, a material called a topological insulator, might be more mysterious than physicists first believed. In recent years scientists have found that this material, which is prized for its ability to conduct electron flow on its surface while insulating flow from its center, may actually be more fragile than it seems and that its topological barriers may in fact be more free-flowing and less predictable. But, far from letting that news discourage them, scientists in a pair of new studies have worked to uncover how such fragile topology may actually be useful. The first study, published Thursday in the journal Science, used mathematical modeling in order to better understand this material's fragile properties. The team focused on the quantum behavior of electrons both on the surface and in the interior of the topological insulator and how it might be possible in some cases for this strict topological barrier to breakdown. Typically in these materials, the wave function (think an electron GPS signal) of electrons in the center of the insulator spread neatly to the edge of the surface, creating a boundary-correspondence that allows for free flow on the surface and insulation in the center. But, by looking at these electrons more closely, the team discovered that this wasn't always the case. The researchers call this scenario a "twisted bulk-boundary-correspondence" and it creates a fragile topology in which electrons cannot flow on the surface of the material. B. Andrei Bernevig, a professor of physics at Princeton and co-author on both papers, said in a statement that this phenomenon is a breakdown of how researchers typically believe these materials behave. "Fragile topology is a strange beast: It is now predicted to exist in hundreds of materials," said Bernevig. "It is as if the usual principle that we have been relying on to experimentally determine a topological state breaks down." In addition to identifying mechanisms that might be causing these strange fragile topologies, the authors also identified that these wave functions could actually be manipulated to change the boundary-correspondence conditions of the material. The researchers say that this ability to tune the material, essentially turning on and off the conductivity of its surface, could be a useful design element for electronic and optical technologies. However, being based in mathematics, the results of this first study were intriguing but still highly theoretical. It took a second paper, complete with a life-sized, 3D printed crystalline structure, to bring the results to life. In the second paper, also published Thursday in the journal Science, researchers explored how twisted bulk-boundary-correspondence could be physically demonstrated. Being in the world of large, Newtonian physics and no longer the small, quantum world of electrons, the research team used sound waves to represent minuscule electron wave functions. Through experiments, the team was able to demonstrate how twisted bulk-boundary-correspondence could occur on the surface as well as how manipulating the sound waves could change the flow of "electrons" on the surface as well. The lead author of the second paper and physicist at ETH Zurich, Sebastian Huber, said in a statement that these unusual theoretical and experimental results point toward a new overarching understanding of these kinds of materials. "This was a very left-field idea and realization," Huber said. "We can now show that virtually all topological states that have been realized in our artificial systems are fragile, and not stable as was thought in the past. This work provides that confirmation, but much more, it introduces a new overarching principle." And when it comes to further research of this property, Bernevig tells Inverse that there's much left to be explored. "Many things [are left to explore]," says Bernevig. "[W]e know absolutely nothing abt how these states respond to other stimuli: disorder, electric and magnetic fields etc. we know nothing about what happens to these states in the presence of strong interactions. Moreover, the physics breakthrough of last year, a material called twisted bilayer graphene, is theoretically thought to exhibit fragile topology in the bands of interest. Understanding how the topology adds to the remarkable properties of such a material will be crucial to figuring out its puzzles" Abstract for paper 1: A topological insulator reveals its nontrivial bulk through the presence of gapless edge states: This is called the bulk-boundary correspondence. However, the recent discovery of “fragile” topological states with no gapless edges casts doubt on this concept. We propose a generalization of the bulk-boundary correspondence: a transformation under which the gap between the fragile phase and other bands must close. We derive specific twisted boundary conditions (TBCs) that can detect all the two-dimensional eigenvalue fragile phases. We develop the concept of real-space invariants, local good quantum numbers in real space, which fully characterize these phases and determine the number of gap closings under the TBCs. Realizations of the TBCs in metamaterials are proposed, thereby providing a route to their experimental verification. Abstract for paper 2: Symmetries crucially underlie the classification of topological phases of matter. Most materials, both natural as well as architectured, possess crystalline symmetries. Recent theoretical works unveiled that these crystalline symmetries can stabilize fragile Bloch bands that challenge our very notion of topology: Although answering to the most basic definition of topology, one can trivialize these bands through the addition of trivial Bloch bands. Here, we fully characterize the symmetry properties of the response of an acoustic metamaterial to establish the fragile nature of the low-lying Bloch bands. Additionally, we present a spectral signature in the form of spectral flow under twisted boundary conditions This article has been updated to include original comment from the researcher.
<urn:uuid:6720710f-9aa2-45a3-a6c5-8681331bc536>
CC-MAIN-2023-06
https://www.inverse.com/innovation/fragile-topology
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500294.64/warc/CC-MAIN-20230205224620-20230206014620-00803.warc.gz
en
0.939288
1,311
4.09375
4
The telecommunications industry has seen rapid growth in recent years, as more and more people use the internet and phone services to stay connected. This growth has led to increased competition, which has caused telecom systems to become more reliable and efficient. Today, telecom systems are used to connect businesses and individuals all over the world. WHAT NEW TELECOM TECHNOLOGIES ARE ON THE HORIZON? One such technology is 5G.5G is considered the next generation of wireless telecommunications. It utilizes higher frequencies and offers greater speeds and capacity than current wireless networks. Another upcoming technology is blockchain. It helps to secure deals, reduce fraud, and prevent third-party interference. A final technology that is set to revolutionize telecom is quantum computing. Quantum computers are able to perform certain tasks much faster than classical computers. This could lead to significant changes in how we use telecom services, as well as other industries. HOW TELECOM SYSTEMS OPERATE? Telecommunications systems operate through the transmission and reception of signals. Signals are created when an electric current is passed through a conductor, such as wire, cable, or telephone line. The type of signal that is transmitted depends on the type of electrical equipment used to create it. For example, a voice using plain old telephone service (POTS) uses an analog signal. Analog signals can be transmitted over shorter distances because they do not lose their clarity as they travel down the line. Digital signals use pulses of electricity and are more efficient over long distances because they can be compressed into smaller packages. They are also easier to interpret than analog signals. WHAT OUR CURRENT TELECOM SYSTEMS DO AND HOW THEY CAN BE IMPROVED UPON? Telecommunications systems are ubiquitous and fundamental to modern life. They enable people to communicate with each other, exchange information, and access the internet. Today’s telecommunications systems use a variety of technologies, but they all share some common features. First, telecommunications systems use wires to transmit signals between devices. These wires can be situated in different locations, which allows telecom systems to reach far-flung corners of the world. Second, telecommunications systems use centralized servers to store and manage data. This allows telecom systems to keep track of important information and make it available to users in a quick and reliable manner. Third, telecommunications systems use algorithms to determine what data should be sent over which wire. This process is known as transmission scheduling and it determines how much bandwidth each device will get during a given moment in time. HOW TELECOM SYSTEMS ARE USED TO COMMUNICATE? Telecommunications have been around for more than a century and are used to communicate through signals that are transmitted through the air or over water. These signals can be sent through wires, satellites, or radio waves. Telecommunications systems use different technologies to send and receive messages. The most common telecommunications system is the telephone system. Telephone systems use copper wires to transmit voice and data. Telephone networks use switches to routes calls between customers. The telephone system is reliable and easy to use, but it can be slow. HOW TELEPHONE SYSTEMS HAVE EVOLVED OVER THE YEARS? Telephones have evolved over the years in terms of design, features and functionality. The earliest phones were simple devices that allowed people to make and receive calls. Over time, phones became more sophisticated and allowed for more features, such as voice recognition and messaging. In recent years, phones have even become embedded into our everyday lives by being integrated into our vehicles and homes. As phone technology continues to evolve, we can expect even more amazing advances in the future! THE IMPACT OF TELECOMMUNICATIONS ON BUSINESS? Telecommunications are essential for businesses of all sizes, as they provide a platform for communication, networking and collaboration. Telecommunication technology has had a dramatic impact on business in recent years, with many companies now relying on telecommunications to run their operations. Here are some of the ways telecommunications have helped businesses: Employees can stay connected with family and friends while at work. This allows businesses to keep employees organized and productive, while also reducing stress levels. Businesses can communicate with customers and other stakeholders easily and quickly. This allows companies to resolve issues quickly and ensure that everyone is on the same page. Telecommunications allow businesses to conduct business remotely from any location. HOW TELECOM PROVIDERS ARE USING BIG DATA TO IMPROVE CUSTOMER SERVICE AND MARKETING EFFORTS? Telecommunications providers are using big data to improve customer service and marketing efforts. By understanding customer behavior, providers can better serve customers and create more engaging marketing campaigns. For example, Verizon has used data analytics to identify areas of its network where congestion is most likely to occur. The company then creates plans that prioritize traffic flows during times of congestion so that customers have the best possible experience. Similarly, AT&T uses big data to target advertising to specific demographics. By understanding who watches particular shows or reads specific magazines, the company can create ads that are more likely to be seen by those interested in those topics. THE EVOLUTION OF TELECOM SYSTEMS? Telecommunications systems have evolved dramatically over the past few decades. For one, the use of technology has allowed for telecommunications to become more widespread and accessible. Additionally, advancements in technology have allowed telecommunications providers to offer a wider range of services and products. In recent years, telecom providers have also started offering new technologies such as voice over internet protocol (VoIP) and mobile broadband. As telecom systems continue to evolve, it is important for businesses and consumers to keep up with the changes in order to stay ahead of the competition. Telecom systems have evolved over the years to accommodate ever-growing demand for communication. Advances in technology and the need for faster and more reliable service have led to the development of new telecom systems. As always, innovation is key to keeping customers happy and meeting their needs. So, stay ahead of the curve and keep your telecom system up to date with the latest advancements.
<urn:uuid:b1e32da5-a5c4-4a7d-8baa-5b5e72615c3f>
CC-MAIN-2023-06
https://www.myiceweb.com/2022/04/21/the-evolution-of-telecom-systems/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764495001.99/warc/CC-MAIN-20230127164242-20230127194242-00643.warc.gz
en
0.948434
1,231
3.609375
4
In the previous article, I explained the reason why quantum computing is interesting and I simply explained how they’re conceptually different from traditional computers. Recently, I have been wondering how hardware could change when it comes to Quantum computers, so I tried to find answers for this question. In this article I will try cover the difference I found and give some technical details about hardware differences. For instance, a computer is essentially made of: - Motherboard (main board with circuits and electrical CMOS transistors) - Storage (HDD) - Network card - Graphics card - Power supply - Eventually all ports with I/O accessories (monitor, keyboard, mouse…etc) (This is a very simplified summary of main components found in our PCs) - Storage is measured in (mega)bits/bytes. - Network transmissions is measure in (mega)Bits/bytes Per Second. - CPUs are made of multiple cores with their performance measured by their frequency (in Hz/GHz). This means the processing speed is a few billion simple logic operations per second. - Evolution is limited by Moore’s law. If we’re ever to turn to Quantum computers how will these concepts Here are some answers (as far as we know at the time of writing): Please note that quantum computers we have at the moment are usually hybrid, which means we use their computational power in conjunction with a classic computer. They are not meant to replace Classical computers and therefore we don’t really have a classical component Vs. Quantum component for each item. However I will cover what makes a Quantum computer (in the widely used Hybrid model like IBM’s machines) Quantum computers are made of: - Main board with SQUID – “a quantum transistor” - QPU (quantum processing unit), AKA co-processor. - No Storage: By its nature, you can’t save or duplicate information on a quantum computer. There is some work on quantum hard drives and there are workaround involving using DNA or converting qubits to bit and store them on regular HDD. - No RAM: During calculations the qubits themselves hold the data required. There is the qram concept but I could not find any proofs that is being used in practice in current QCs. - Network card: Current (hybrid) Quantum Computers are not networked and they communicate through the paired classical computer. Quantum networks via fiber links as a medium are currently a possible solution. This topic is highly active in the QC research field! - No Graphics card: quantum computers are mainly used for calculations. You’re not going to play games watch videos on them. They’re not designed to perform such tasks. The classical part takes care of that. - Power supply: it differs a bit from the classical one, but it still runs on normal electricity to power the cooling system. - I/O is managed by converting them from/to binary through quantum measurement (The Quantum processors we currently have are always used along with a classical computer to control them.) Differences from the common classical concepts: - Storage is measured in quantum bit qubit (aka: Qbit – the equivalent of bit). - I could not find enough information on quantum networks speed measurements. Afaik, the existing QCs are not directly connected to internet and there is no information regarding the efficiency of a network between two connected quantum computers in the experiments I found in Delft and tum. I am not aware of 2 distant Quantum computers that have been directly connected. - To compare performance, we can’t talk about GHz anymore. However, we need to compare efficiency in terms of time-complexity. The processing power of QCs is measured in teraflops – rate of flipping (trillions of logic operations per second). Each QPU has a finite number of qubits. The more qubits the faster its performance. The fastest I heard of is 2000 Qubits computer. (not universal) - Quantum computing still can’t break Moore’s law, since it is a law about transistors in a dense integrated circuit. Therefore, the law does not even apply to Quantum computers (QCs) and thus QCs are not affected by it. However, another law exists for QCs, it’s called rose’s law. Some information about quantum storage technologies: In order to find the answers I was looking for and write this article I had to read some resources, here’s the list: I hope you liked my article! I’m thinking on writing my next article on Quantum networks and quantum cloud computing. I’m always wondering what happens if we host websites on quantum computers? How can quantum computers revolutionize the web? Is such a thing possible or useful? Maybe I will gather more information and write my next article to talk about that (spoiler: remember we said quantum won’t replace classical) 🙄
<urn:uuid:095a1ce7-3d8c-40d2-842e-b6097e8af86c>
CC-MAIN-2023-06
https://mohamedh.me/blog/quantum-computers-vs-classical-computers-hardware-components-differences/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500837.65/warc/CC-MAIN-20230208155417-20230208185417-00204.warc.gz
en
0.927745
1,191
3.75
4
Scanning electron microscope image of a device, similar to the one used. Highlighted are the positions of the tuning gates (red), the microwave antenna (blue), and the single electron transistor used for spin readout (yellow). Credit: Guilherme Tosi & Arne Laucht/UNSW Australian engineers have created a new quantum bit which remains in a stable superposition for 10 times longer than previously achieved, dramatically expanding the time during which calculations could be performed in a future silicon quantum computer. The new quantum bit, made up of the spin of a single atom in silicon and merged with an electromagnetic field – known as ‘dressed qubit’ – retains quantum information for much longer that an ‘undressed’ atom, opening up new avenues to build and operate the superpowerful quantum computers of the future. The result by a team at Australia’s University of New South Wales (UNSW), appears today in the online version of the international journal, Nature Nanotechnology. “We have created a new quantum bit where the spin of a single electron is merged together with a strong electromagnetic field,” said Arne Laucht, a Research Fellow at the School of Electrical Engineering & Telecommunications at UNSW, and lead author of the paper. “This quantum bit is more versatile and more long-lived than the electron alone, and will allow us to build more reliable quantum computers.” Find your dream job in the space industry. Check our Space Job Board » Building a quantum computer has been called the ‘space race of the 21st century’ – a difficult and ambitious challenge with the potential to deliver revolutionary tools for tackling otherwise impossible calculations, such as the design of complex drugs and advanced materials, or the rapid search of massive, unsorted databases. Its speed and power lie in the fact that quantum systems can host multiple ‘superpositions’ of different initial states, which in a computer are treated as inputs which, in turn, all get processed at the same time. “The greatest hurdle in using quantum objects for computing is to preserve their delicate superpositions long enough to allow us to perform useful calculations,” said Andrea Morello, leader of the research team and a Program Manager in the Centre for Quantum Computation & Communication Technology (CQC2T) at UNSW. “Our decade-long research program had already established the most long-lived quantum bit in the solid state, by encoding quantum information in the spin of a single phosphorus atom inside a silicon chip, placed in a static magnetic field,” he said. What Laucht and colleagues did was push this further: “We have now implemented a new way to encode the information: we have subjected the atom to a very strong, continuously oscillating electromagnetic field at microwave frequencies, and thus we have ‘redefined’ the quantum bit as the orientation of the spin with respect to the microwave field.” The results are striking: since the electromagnetic field steadily oscillates at a very high frequency, any noise or disturbance at a different frequency results in a zero net effect. The researchers achieved an improvement by a factor of 10 in the time span during which a quantum superposition can be preserved. Specifically, they measured a dephasing time of T2*=2.4 milliseconds – a result that is 10-fold better than the standard qubit, allowing many more operations to be performed within the time span during which the delicate quantum information is safely preserved. “This new ‘dressed qubit’ can be controlled in a variety of ways that would be impractical with an ‘undressed qubit’,”, added Morello. “For example, it can be controlled by simply modulating the frequency of the microwave field, just like in an FM radio. The ‘undressed qubit’ instead requires turning the amplitude of the control fields on and off, like an AM radio. “In some sense, this is why the dressed qubit is more immune to noise: the quantum information is controlled by the frequency, which is rock-solid, whereas the amplitude can be more easily affected by external noise”. Since the device is built upon standard silicon technology, this result paves the way to the construction of powerful and reliable quantum processors based upon the same fabrication process already used for today’s computers. The UNSW team leads the world in developing quantum computing in silicon, and Morello’s team is part of the consortium of UNSW researchers who have struck a A$70 million deal between UNSW, the researchers, business and the Australian government to develop a prototype silicon quantum integrated circuit – the first step in building the world’s first quantum computer in silicon. A functional quantum computer would allow massive increases in speed and efficiency for certain computing tasks – even when compared with today’s fastest silicon-based ‘classical’ computers. In a number of key areas – such as searching large databases, solving complicated sets of equations, and modelling atomic systems such as biological molecules and drugs – they would far surpass today’s computers.They would also be enormously useful in the finance and healthcare industries, and for government, security and defence organisations. Quantum computers could identify and develop new medicines by greatly accelerating the computer-aided design of pharmaceutical compounds (and minimising lengthy trial and error testing), and develop new, lighter and stronger materials spanning consumer electronics to aircraft. They would also make possible new types of computational applications and solutions that are beyond our ability to foresee. Source: University of New South Wales Arne Laucht et al, A dressed spin qubit in silicon, Nature Nanotechnology (2016). DOI: 10.1038/nnano.2016.178
<urn:uuid:29adaf27-2c9a-48cb-8977-c1b00d0bdc50>
CC-MAIN-2023-06
https://sciencebulletin.org/quantum-computers-10-fold-boost-in-stability-achieved/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499891.42/warc/CC-MAIN-20230131222253-20230201012253-00365.warc.gz
en
0.925071
1,202
3.734375
4
Quantum computers have the potential to drive major advances in everything from medicine, to manufacturing, to the way we produce materials. But while quantum computers could help us solve problems currently out of reach of conventional computers, they are also much more prone to making mistakes—mainly because they are incredibly sensitive to the smallest changes in their environment. Amazon launched the AWS Center for Quantum Computing in 2019 with a goal of accelerating the development of quantum computing technologies and applications. Now, the company is launching a new facility for quantum computing at the California Institute of Technology (Caltech) with the ambitious goal of building a "fault-tolerant" quantum computer. Teams will focus on developing more powerful quantum computing hardware and identifying new applications for quantum technologies. Here are five challenges they'll be wrapping their heads around: Making more and better qubits Conventional computers use bits—usually represented as a value of 1 or 0 in code—as their most basic unit of information. A bit can be anything with two distinct states. For example, a light that is either on or off, or a door that is either open or closed. But quantum computers use quantum bits, or "qubits"—usually elementary particles such as electrons or photons—to make calculations. Unlike bits, qubits can be manipulated to exist in a quantum state known as superposition, where they are both 1 and 0 at the same time, as well as all the possible states in between. This, along with some other equally mind-bending behaviors of qubits in a quantum state, allow quantum computers to perform certain calculations exponentially more efficiently than any current or future conventional computer. The AWS team will construct qubits from superconducting materials, such as aluminum patterned into electrical circuits on silicon microchips. That's because the techniques to manufacture these are well understood, making it possible to produce many more qubits, in a repeatable way, and at scale. Keeping the noise down The ability of qubits to exist in a quantum state is what gives quantum computers the potential to be massively more powerful than conventional computers when performing certain calculations. But keeping qubits in this state is—to put it mildly—a massive headache. Even the tiniest changes in their environment (referred to by quantum scientists as "noise"), such as vibrations or heat, can knock them out of superposition, causing them to lose information and become more error prone. The key to building successful quantum computers lies in controlling these errors. One area AWS will be investing in is material improvements to reduce noise, such as superconductors with surfaces prepared one atomic layer at a time, to minimize defects. Developing a bigger quantum computer One of the most challenging aspects of building quantum computers is how to scale them up. To go beyond the realms of what's already possible with conventional computers, they will need to be much, much larger than current machines. Today's quantum computers are "noisy" and error prone. The goal for quantum researchers is to scale from a handful of noisy qubits to a machine with hundreds and then thousands of very low-noise qubits. The new AWS facility includes everything the teams need to push the boundaries of quantum research and development, including technologies required to support bigger quantum devices, such as cryogenic cooling systems to protect devices from thermal noise and nanoscale fabrication tools required to construct new forms of quantum circuits. Reducing the cost of error correction Aside from investing in innovations to reduce noise, AWS will also be working on building error correction into quantum computing hardware, using redundant sets of physical qubits to form so-called "logical" qubits, which encode quantum information and can be used to detect and correct errors. Performing error correction in this way is typically very expensive and resource intensive, due the large amount of physical hardware required to generate logical qubits. AWS is researching ways to reduce these costs by designing more efficient methods of implementing error correction into quantum hardware. Speeding up the clock There's more to building a useful quantum computer than simply increasing the number of qubits. Another important metric is the computer's clock speed, or the time it takes to perform "quantum gate operations" and do so accurately. (Quantum gates are essentially the building blocks of quantum circuits—the models by which quantum computers make calculations.) This is where superconducting qubits offer an advantage, because they make it easier to speed up quantum gates. As AWS tries to build better qubits, its ultimate measure of success will be the extent to which it can speed up the clock while reducing quantum gate errors. The AWS Center for Quantum Computing The AWS Center for Quantum Computing brings together quantum computing experts from Amazon, Caltech, and other top academic research institutions. The center also offers scholarships and training opportunities for students and young faculty members, helping to support the quantum scientists of the future. The center's ultimate goal is to build an entirely new type of computer: a fault-tolerant quantum machine able to perform accurate computations beyond anything offered by conventional computing technology at the scale needed to solve complex problems that could have a major impact on how we all live and work. Learn more about the AWS Center for Quantum Computing.
<urn:uuid:103556e2-31d8-4d13-8b26-a5008e0d42cc>
CC-MAIN-2023-06
https://www.aboutamazon.com/news/aws/aws-launches-new-quantum-computing-center
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500339.37/warc/CC-MAIN-20230206113934-20230206143934-00285.warc.gz
en
0.933387
1,063
3.796875
4
What is Quantum Computing? Quantum Computers or Quantum Computing machines are the small machines that perform basic computations that are based on different properties of quantum physics, classical computers, and many more which include a wide variety of gadgets like smartphones, laptops convert the information in binary which are 0’s and 1’s (bits). In Quantum computers each basic unit of memory is a quantum bit or qubit. What is Qubit? A qubit or a quantum bit is derived from a unit of a memory which is a basic unit of quantum information. It is a quantum version of the classical binary bit which physically realized in two different states. To understand further let’s see a quick example Suppose let’s consider a mobile phone. Let it be a modern smartphone, knows where you are on the planet, all the time often to within meters. That’s very useful if you are trying to find your way in some unfamiliar part of town, but if you pause for a moment to think about this, it’s actually a bit creepy how does your phone do? Global Position System The answer is of course by a GPS (Global Position System). At any time there are between 24 to 34 working GPS satellites in the orbit, with an altitude of 20,000 KM and if your phone receives signals from at least 4 of them, it can work out where it is. That’s because the satellite knows where they will tell your mobile phone. The time delay in receiving a message from the satellite to phone determines how far away the satellite is from your phone which means distance is time delay times the speed of light. Knowing the distance of four satellites then allows your phone to triangulate and establish where it is. However, light travels about 30cm in one nanosecond, so in order to get your location down to within a meter, the error in the time delay can only be few nanoseconds. So, the precise normal clocks are not sufficient we need to use much more precise atomic clocks. So, to monitor how atomic clock takes transitions of an atom we need quantum mechanics so, you know where you are on the planet every time you check your phone’s map because of quantum mechanics which is a part of quantum computing. Another way in which quantum mechanics impact our life is via transistors. These are tiny devices a few tense of nanometers which are typically made from silicon, gallium, or some other semiconducting material, transistors are used as very fast current switches in microchips and they can be made to perform logic operations. Very large computations can be performed in seconds on a chip consisting of transistors. A typical mobile phone chip has several billion transistors. What is a Transister? A semiconductor transistor is made up of two types of semi conducting material called p-type and n-type. The type indicates whether the current in the semiconductor is carried by electrons (n-type) or holes (p-type). A hole are where an electron is supposed to sit to fill the shell, but is missing instead, in a transistor we can either have n type between the two layers of p-type or the other way around gives us npn or pnp types of transistors. Depending on the voltage that we apply to the middle layer we can open or close the flow of current between the outer layers. What is a Bit? Even though the computer operates on the principle of quantum mechanics, the actual logic carried out on the computers is zeros and ones. The unit of information in computers, the “bit”, is an abstract idea that represents a physical system that can be in two distinct states. For example, a light bulb can be on or off, wire in a computer can carry a current, a capacitor can carry a charge or no charge. In every case, the system is either on or off we can apply logic to these states as follows if we have three light bulbs A, B&C and set up a circuit such that c is on whenever both A& B are on, then the state of light bulb C is the logical value of A&B. So, in the previous example, let’s return to the atomic transitions, every transition has two states ground state and excited state. When we return an atom to the atomic clock then the transition takes place. We can also call these two states zero and one, giving us a tiny bit. But it gives us more, the atom can be in one of these two states and also in a quantum superposition of these two states. This is in fact how quantum mechanics allows us to calculate how the atom jumps between these states in an atomic clock. Now we can see that at a fundamental quantum mechanical level, a bit is not just a system with two states that is labeled zero and one but also allows superpositions between zero and one. This gives us how to manipulate the information stored in the system and it turns out that computers built on quantum principles are more powerful than ordinary computers. To distinguish the fundamental quantum systems in this new type of computer, we call them quantum bits or qubits. There are many ways in which we can construct qubits besides the atomic levels they are electronic spin qubits, photon polarization qubits, and superconducting qubits, and so on. How does Quantum computing work with python? The operations which take place from one classical computing state to superposition state is just only one which mostly looks either a 0 or 1. The combination of information also is in the form of 0 and 1. It only takes two operations to create entanglement between two qubits. IBM was the first company to put a first quantum computer on cloud extending the reach of technologies beyond the research laboratories. Nowadays there is a high scope of quantum developers in the real world, wondering if you, too, should get quantum ready? The short answer will be always YES! Quantum Computing Using Python In quantum computing using python, developers mostly use Qiskit. Qiskit is an open-source Programming platform that uses python language. If you are aware of Python, then you’re on the right path to take advantage of superposition, entanglement, and interface in quantum programs. For seasoned developers in the industry looking to explore the potential applications of quantum computing, the Qiskit element Aqua (algorithms for quantum computing applications) offers a library of algorithms for artificial intelligence, chemistry, finance, and optimization. For example, there are a number of finance-related tutorials to experiment with credit risk analysis, fixed income pricing, basket option pricing, and others. The field of Quantum computing is developing more and more the above content was just a small part of quantum computing works with python. This post is contributed by Syed Rizwan. In this article, I will show you the Top 25 Pattern Programs in C++ that you must learn which will help you to understand the usage of nested loops. We will learn to create different geometrical shapes like Squares, triangles, Rhombus, etc. We have provided the code as well as output for all the patterns… In this article, we will build a simple program for Currency Converter in C++. The user will be able to convert currencies like Dollar, Euro, Pound, and Rupee. This is a console application and very simple to understand as we only use a bunch of conditionals to write the entire program. Features Users can convert… CRUD stands for Create Read Update Delete. I will show you how to perform CRUD Operations in Python. You need basic Tkinter and SQLite knowledge before you read further. This app is straightforward, when you will open this app, a GUI with 4 green colored buttons will open to perform CRUD(create read update delete) operations.… In this article, we will build a simple Number Guessing Game in C++. It’s a game in which the player has to guess a secret number in a range that is generated by the computer and upon a wrong guess by the player, the game will give hints to the player to guess the correct… Hello friends, in this article, we will learn how to create an Image background remover in Python. I will first show you a simple program and then you will see a smile GUI made using tkinter. We need to install rembg, you can install rembg using the pip command: pip install rembg Image background remover… Hello everyone, have you ever wondered while looking inside a big C++ project structure what are these different folders, subfolders, and files used for? Especially when we have just started working with C++ we get lots of questions and confusion regarding this. So let’s talk about the C++ Project Structure of a general application. Whenever…
<urn:uuid:55fb7ec8-c9e1-440a-b283-e18b8f6274f5>
CC-MAIN-2023-06
https://copyassignment.com/quantum-computing-using-python/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500251.38/warc/CC-MAIN-20230205094841-20230205124841-00325.warc.gz
en
0.924124
1,833
3.671875
4
Scientists have shown how an optical chip can simulate the motion of atoms within molecules at the quantum level, which could lead to better ways of creating chemicals for use as pharmaceuticals. An optical chip uses light to process information, instead of electricity, and can operate as a quantum computing circuit when using single particles of light, known as photons. Data from the chip allows a frame-by-frame reconstruction of atomic motions to create a virtual movie of a molecule’s quantum vibrations, which is what lies at the heart of the research published today in Nature. “We program the chip, mapping its components to the structure of a particular molecule, say ammonia, then simulate how a particular vibrational pattern evolves over some time interval.” These findings are the result of a collaboration between researchers at the University of Bristol, MIT, IUPUI, Nokia Bell Labs, and NTT. As well as paving the way for more efficient pharmaceutical developments, the research could prompt new methods of molecular modelling for industrial chemists. When lasers were invented in the 1960s, experimental chemists had the idea of using them to break apart molecules. However, the vibrations within molecules rapidly redistribute the laser energy before the intended molecular bond is broken. Controlling the behaviour of molecules requires an understanding of how they vibrate at the quantum level. But modelling these dynamics requires massive computational power, beyond what we can expect from coming generations of supercomputers. The Quantum Engineering and Technology Labs at Bristol have pioneered the use of optical chips, controlling single photons of light, as basic circuitry for quantum computers. Quantum computers are expected to be exponentially faster than conventional supercomputers at solving certain problems. Yet constructing a quantum computer is a highly challenging long-term goal. As reported in Nature, the team demonstrated a new route to molecular modelling that could become an early application of photonic quantum technologies. The new methods exploit a similarity between the vibrations of atoms in molecules and photons of light in optical chips. Bristol physicist Dr Anthony Laing, who led the project, explained: “We can think of the atoms in molecules as being connected by springs. Across the whole molecule, the connected atoms will collectively vibrate, like a complicated dance routine. At a quantum level, the energy of the dance goes up or down in well-defined levels, as if the beat of the music has moved up or down a notch. Each notch represents a quantum of vibration. “Light also comes in quantised packets called photons. Mathematically, a quantum of light is like a quantum of molecular vibration. Using integrated chips, we can control the behaviour of photons very precisely. We can program a photonic chip to mimic the vibrations of a molecule. “We program the chip, mapping its components to the structure of a particular molecule, say ammonia, then simulate how a particular vibrational pattern evolves over some time interval. By taking many time intervals, we essentially build up a movie of the molecular dynamics.” First author Dr Chris Sparrow, who was a student on the project, spoke of the simulator’s versatility: “The chip can be reprogrammed in a few seconds to simulate different molecules. In these experiments we simulated the dynamics of ammonia and a type of formaldehyde, and other more exotic molecules. We simulated a water molecule reaching thermal equilibrium with its environment, and energy transport in a protein fragment. “In this type of simulation, because time is a controllable parameter, we can immediately jump to the most interesting points of the movie. Or play the simulation in slow motion. We can even rewind the simulation to understand the origins of a particular vibrational pattern.” Joint first author, Dr Enrique Martín-Lopéz, now a Senior Researcher with Nokia Bell Labs, added: “We were also able to show how a machine learning algorithm can identify the type of vibration that best breaks apart an ammonia molecule. A key feature of the photonic simulator that enables this is its tracking of energy moving through the molecule, from one localised vibration to another. Developing these quantum simulation techniques further has clear industrial relevance.” The photonic chip used in the experiments was fabricated by Japanese Telecoms company NTT. Dr Laing explained the main directions for the future of the research: “Scaling up the simulators to a size where they can provide an advantage over conventional computing methods will likely require error correction or error mitigation techniques. And we want to further develop the sophistication of molecular model that we use as the program for the simulator. Part of this study was to demonstrate techniques that go beyond the standard harmonic approximation of molecular dynamics. We need to push these methods to increase the real-world accuracy of our models. “This approach to quantum simulation uses analogies between photonics and molecular vibrations as a starting point. This gives us a head start in being able to implement interesting simulations. Building on this, we hope that we can realise quantum simulation and modelling tools that provide a practical advantage in the coming years.” - Quantum Impossibilities: A conversation with whurley - Seeing is believing: precision atom qubits achieve milestone - He Built the Xbox—Can He Make a Microsoft Product Out of Quantum Computing? - UCLA scientists merge statistics, biology to produce important new gene computational tool
<urn:uuid:d68a90e6-cc33-44c0-8e97-716cc1963e2a>
CC-MAIN-2023-06
https://ageofrobots.net/scientists-use-a-photonic-quantum-simulator-to-make-virtual-movies-of-molecules-vibrating/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764494986.94/warc/CC-MAIN-20230127132641-20230127162641-00525.warc.gz
en
0.91515
1,105
3.921875
4
Quantum computing is the use of quantum-mechanical phenomena such as superposition and entanglement to perform computation. Computers that perform quantum computations are known as quantum computers. Quantum computers are believed to be able to solve certain computational problems, such as integer factorization (which underlies RSA encryption), substantially faster than classical computers. The study of quantum computing is a subfield of quantum information science. Quantum computing began in the early 1980s, when physicist Paul Benioff proposed a quantum mechanical model of the Turing machine. Richard Feynman and Yuri Manin later suggested that a quantum computer had the potential to simulate things that a classical computer could not. In 1994, Peter Shor developed a quantum algorithm for factoring integers that had the potential to decrypt RSA-encrypted communications. Despite ongoing experimental progress since the late 1990s, most researchers believe that “fault-tolerant quantum computing [is] still a rather distant dream.” In recent years, investment into quantum computing research has increased in both the public and private sector. On 23 October 2019, Google AI, in partnership with the U.S. National Aeronautics and Space Administration (NASA), published a paper in which they claimed to have achieved quantum supremacy. While some have disputed this claim, it is still a significant milestone in the history of quantum computing. There are several models of quantum computing, including the quantum circuit model, quantum Turing machine, adiabatic quantum computer, one-way quantum computer, and various quantum cellular automata. The most widely used model is the quantum circuit. Quantum circuits are based on the quantum bit, or “qubit”, which is somewhat analogous to the bit in classical computation. Qubits can be in a 1 or 0 quantum state, or they can be in a superposition of the 1 and 0 states. However, when qubits are measured the result is always either a 0 or a 1; the probabilities of these two outcomes depend on the quantum state that the qubits were in immediately prior to the measurement. Computation is performed by manipulating qubits with quantum logic gates, which are somewhat analogous to classical logic gates. There are currently two main approaches to physically implementing a quantum computer: analog and digital. Analog approaches are further divided into quantum simulation, quantum annealing, and adiabatic quantum computation. Digital quantum computers use quantum logic gates to do computation. Both approaches use quantum bits or qubits.:2–13 There are currently a number of significant obstacles in the way of constructing useful quantum computers. In particular, it is difficult to maintain the quantum states of qubits as they are prone to quantum decoherence, and quantum computers require significant error correction as they are far more prone to errors than classical computers. Any computational problem that can be solved by a classical computer can also, in principle, be solved by a quantum computer. Conversely, quantum computers obey the Church–Turing thesis; that is, any computational problem that can be solved by a quantum computer can also be solved by a classical computer. While this means that quantum computers provide no additional power over classical computers in terms of computability, they do in theory provide additional power when it comes to the time complexity of solving certain problems. Notably, quantum computers are believed to be able to quickly solve certain problems that no classical computer could solve in any feasible amount of time—a feat known as “quantum supremacy.” The study of the computational complexity of problems with respect to quantum computers is known as quantum complexity theory. Besides factorization and discrete logarithms, quantum algorithms offering a more than polynomial speedup over the best known classical algorithm have been found for several problems, including the simulation of quantum physical processes from chemistry and solid state physics, the approximation of Jones polynomials, and solving Pell’s equation. No mathematical proof has been found that shows that an equally fast classical algorithm cannot be discovered, although this is considered unlikely. However, quantum computers offer polynomial speedup for some problems. The most well-known example of this is quantum database search, which can be solved by Grover’s algorithm using quadratically fewer queries to the database than that are required by classical algorithms. In this case, the advantage is not only provable but also optimal, it has been shown that Grover’s algorithm gives the maximal possible probability of finding the desired element for any number of oracle lookups. Several other examples of provable quantum speedups for query problems have subsequently been discovered, such as for finding collisions in two-to-one functions and evaluating NAND trees. Problems that can be addressed with Grover’s algorithm have the following properties: - There is no searchable structure in the collection of possible answers, - The number of possible answers to check is the same as the number of inputs to the algorithm, and - There exists a boolean function which evaluates each input and determines whether it is the correct answer For problems with all these properties, the running time of Grover’s algorithm on a quantum computer will scale as the square root of the number of inputs (or elements in the database), as opposed to the linear scaling of classical algorithms. A general class of problems to which Grover’s algorithm can be applied is Boolean satisfiability problem. In this instance, the database through which the algorithm is iterating is that of all possible answers. An example (and possible) application of this is a password cracker that attempts to guess the password or secret key for an encrypted file or system. Symmetric ciphers such as Triple DES and AES are particularly vulnerable to this kind of attack. This application of quantum computing is a major interest of government agencies. Since chemistry and nanotechnology rely on understanding quantum systems, and such systems are impossible to simulate in an efficient manner classically, many believe quantum simulation will be one of the most important applications of quantum computing. Quantum simulation could also be used to simulate the behavior of atoms and particles at unusual conditions such as the reactions inside a collider. Quantum annealing or Adiabatic quantum computation relies on the adiabatic theorem to undertake calculations. A system is placed in the ground state for a simple Hamiltonian, which is slowly evolved to a more complicated Hamiltonian whose ground state represents the solution to the problem in question. The adiabatic theorem states that if the evolution is slow enough the system will stay in its ground state at all times through the process. The Quantum algorithm for linear systems of equations, or “HHL Algorithm”, named after its discoverers Harrow, Hassidim, and Lloyd, is expected to provide speedup over classical counterparts. The above is a brief about Quantum Computing. Watch this space for more updates on the latest trends in Technology.
<urn:uuid:aa2eb667-876d-4bed-9f7d-f7bd1cef3cfe>
CC-MAIN-2023-06
https://blog.amt.in/index.php/2023/01/19/insights-on-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499829.29/warc/CC-MAIN-20230130201044-20230130231044-00407.warc.gz
en
0.940825
1,409
4
4
Physicists see 27th dimension of photons By Jesse Emspak Published January 29, 2014 Scientists find a way to directly measure quantum states, such as momentum, of photons. (MPQ, Quantum Dynamics Division.) Quantum computers and communications promise more powerful machines and unbreakable codes. But to make them work, it’s necessary to measure the quantum state of particles such as photons or atoms. Quantum states are numbers that describe particle characteristics such as momentum or energy. But measuring quantum states is difficult and time-consuming, because the very act of doing so changes them, and because the mathematics can be complex. Now, an international team says they found a more efficient way to do it, which could make it simpler to build quantum-mechanical technologies. In a study detailed in the Jan. 20 issue of the journal Nature Communications, researchers from the University of Rochester and the University of Glasgow took a direct measurement of a photon’s 27-dimensional quantum state. These dimensions are mathematical, not dimensions in space, and each one is a number that stores information. To understand a 27-dimensional quantum state, think about a line described in 2 dimensions. A line would have a direction in the X and Y coordinates 3 inches left and 4 inches up, for instance. The quantum state has 27 such coordinates. [Quantum Physics: The Coolest Little Particles in Nature] “We chose 27, kind of to make a point about 26 letters in the alphabet and throwing in one more,” said Mehul Malik, now a postdoctoral researcher at the University of Vienna. That means each quantum bit, or “qubit,” could store a letter instead of a simple 1 or 0. Seeing a photon The group, led by Malik and Robert Boyd, a professor of optics and physics at the University of Rochester, was able to see a photon’s states directly. They measured the photon’s orbital angular momentum, which is how much the particles of light “twist” as they travel through space. Ordinarily, finding the quantum state of a photon requires a two-step process. First, scientists have to measure some property of the photon, such as its polarization or momentum. The measurements are performed on many copies of the quantum state of a photon. But that process sometimes introduces errors. To get rid of the errors, the scientists have to look at what results they got that are “disallowed” states ones that don’t follow the laws of physics. But the only way to find them is to search through all the results and discard the ones that are impossible. That eats up a lot of computing time and effort. This process is called quantum tomography. [The 9 Biggest Unsolved Mysteries in Physics] A light wave is a combination of an electric and magnetic field, each of which oscillates and makes a wave. Each wave moves in time with the other, and they are perpendicular to each other. A beam of light is made up of lots of these waves. Light can have what is called orbital angular momentum. In a beam with no orbital angular momentum, the peaks of the waves the electric ones, for example are lined up. A plane connecting these peaks will be flat. If the beam has orbital angular momentum, a plane connecting these peaks will make a spiral, helical pattern, because the light waves are offset from one another slightly as you go around the beam. To measure the state of the photons, scientists must “unravel” this helical shape of the waves in the beam. Measuring a photon’s quantum state The team first fired a laser through a piece of transparent polymer that refracted the light, “unraveling” the helix formed by the waves. The light then passed through special lenses and into a grating that makes many copies of the beam. After passing through the grating, the light is spread out to form a wider beam. After the beam is widened, it hits a device called a spatial light modulator. The modulator carries out the first measurement. The beam then reflects back in the same direction it came from and passes through a beam splitter. At that point, part of thebeam moves toward a slit, which makes a second measurement. [Twisted Physics: 7 Mind-Blowing Experiments] One of the two measurements is called “weak” and the other “strong.” By measuring two properties, the quantum state of the photons can be reconstructed without the lengthy error-correction calculations tomography requires. In quantum computers, the quantum state of the particle is what stores the qubit. For instance, a qubit can be stored in the photon’s polarization or its orbital-angular momentum, or both. Atoms can also store qubits, in their momenta or spins. Current quantum computers have only a few bits in them. Malik noted that the record is 14 qubits, using ions. Most of the time, ions or photons will only have acouple of bits they can store, as the states will be two-dimensional. Physicists use two-dimensional systems because that is what they can manipulate it would be very difficult to manipulate more than two dimensions, he said. Direct measurement, as opposed to tomography, should make it easier to measure the states of particles (photons, in this case). That would mean it is simpler to add more dimensions three, four or even as in this experiment, 27 and store more information. Mark Hillery, a professor of physics at Hunter College in New York, was skeptical that direct measurement would prove necessarily better than current techniques. “There is a controversy about weak measurements in particular, whether they really are useful or not,” Hillery wrote in an email to LiveScience. “To me, the main issue here is whether the technique they are using is better (more efficient) than quantum-state tomography for reconstructing the quantum state, and in the conclusion, they say they don’t really know.” Jeff Savail, a master’s candidate researcher at Canada’s Simon Fraser University, worked on a similar direct measurement problem in Boyd’s lab, and his work was cited in Malik’s study. In an email he said one of the more exciting implications is the “measurement problem.” That is, in quantum mechanical systems the question of why some measurements spoil quantum states while others don’t is a deeper philosophical question than it is about the quantum technologies themselves. “The direct measurement technique gives us a way to see right into the heart of the quantum state we’re dealing with,” he said. That doesn’t mean it’s not useful far from it. “There may also be applications in imaging, as knowing the wave function of the image, rather than the square, can be quite useful.” Malik agreed that more experiments are needed, but he still thinks the advantages might be in the relative speed direct measurement offers. “Tomography reduces errors, but the post-processing [calculations] can take hours,” he said.
<urn:uuid:fba7f4c1-bed2-4626-86ce-ab5ce69d57fb>
CC-MAIN-2023-06
https://mbtimetraveler.com/2014/01/29/physicists-see-27th-dimension-of-photons/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499934.48/warc/CC-MAIN-20230201112816-20230201142816-00848.warc.gz
en
0.946489
1,497
3.515625
4
The smaller computer parts get, the more efficient they become – that was the rule of thumb when it comes to computers and electronics in general. But these parts have reached a state where they can no longer get any smaller without losing the properties used to build machines like modern computers, becoming a barrier to technological advancement. Computers, and technological advances in general, are reaching this physical limit as processors, transistors, and other computer parts approach the size of an atom. Modern electronics have silicon-based transistors as small as 10nm, which is almost a hundred times smaller than the size of red blood cells in the human body. Even smaller than that, transistors quickly and easily begin to play with the laws of classical mechanics, since at that subatomic level classical properties on which modern computers are built are not maintained. This is where quantum mechanics comes into play. For the uninitiated, quantum mechanics is the study of subatomic particles such as electrons, neutrons, and protons. In contrast to the physical objects that surround us, particles on the subatomic scale behave differently. While bits, or binary digits, are the building blocks of classical computing, quantum computing uses much more efficient subatomic qubits for computation. Bits in classical computing can be either 0 or 1, basically an “on” or “off” switch for the transistor to either pass or block electrons. Qubits, on the other hand, can be any combination of 0 and 1. Imagine a glass of lemonade where the lemon juice is 1 and the water is 0. The glass of lemonade is a solution of lemon juice and water, and until the solution is distilled in a lab, there’s no way of telling what ratio you’re in. Qubits are like that. 1 and 0 both exist in some ratio in a qubit, and as with lab testing, they collapse to a stable state of either 1 or 0 only when the qubit is observed or measured, giving us an unequivocal result. This uncertainty of state is called quantum superposition. Aside from this uncertain state, qubits are also mathematically entangled with the qubits in their vicinity. This means that when measuring when a qubit collapses to a 1 or 0 state, the state of the neighboring qubit will be affected by the result. This property is known as quantum entanglement. Because of this entanglement, measuring one qubit can tell us what state the neighboring qubits are in. Quantum computers are built based on these two basic principles of quantum mechanics: superposition and entanglement. The Nobel Prize-winning American physicist Richard Feynman first realized while working on one of his projects that classical computers are not scalable to handle complicated, especially quantum, simulations. He added that the two principles of quantum mechanics could be used to build a much better and more efficient computing system. In 1986, Feynman introduced the early version of the quantum circuit notation, based on which Peter Shor developed a quantum algorithm in 1994. Later, Isaac Chuang, Neil Gershenfeld, and Mark Kubinec developed the world’s first known working quantum computing tool using just two qubits in 1991. Although it was a very early rendition of a primitive computing device, it was quite a leap in the advancement of this nascent technology . Quantum computers are computing devices that control the behavior of particles at the subatomic level. Because the components and building blocks of quantum computers are orders of magnitude smaller than those of classical computers, they are exponentially faster and use only a fraction of the power that conventional computers require. However, contrary to how they are portrayed in the sci-fi genre, quantum computers are not an upgrade of the classic computers we have in our homes. That’s because they work very differently than the computers we have now. They are also exponentially better at complex calculations than the supercomputers that most technology companies like Google, IBM and Microsoft use for their R&D. Comparing classical computers and supercomputers with quantum computers would be like comparing bicycles with motorcycles. Classic computer upgrades often refer to multiplying capacity or efficiency. A decade ago, 1GB of RAM was enough for a PC. But now the 2GB RAM is the bare minimum in modern computers, that’s two 1GB RAM bundled together. Unlike the RAM in classic computers, no matter how many bikes are bundled together, they cannot become a motorcycle, as motorcycles are much more efficient and function differently than bicycles. The same applies to quantum computers, since they are fundamentally different from conventional computers. That’s why physicists and researchers behind this technology insist that quantum computers are not an upgrade from supercomputers, but an entirely different superclass of computers that will change the course of computing algorithms for the future. These computing devices are so advanced that they take a fraction of the time and energy to solve a problem that even modern supercomputers take hours to solve. A simple example would be how efficient they are at a database search. For example, if there is a database with 1 trillion names and a search is performed, classical computers and supercomputers compare every single name in the database to the search, which means a trillion operations for just a simple search. On the other hand, using the properties of qubits, a quantum computer can perform the same operation in significantly fewer steps. For the same search operation with 1 trillion names, quantum computers would only need to perform 1 million operations, which is a million times fewer operations than classical or supercomputers would require for the results. What supercomputers can do, quantum computers can do with a fraction of the resources. However, progress in this technology has been slow. Although companies like IBM, Google and Microsoft have invested heavily in the development of quantum computing tools in recent years, we are nowhere near a full-fledged prototype for commercial or personal use. News of prototypes from several Chinese and American researchers break out every few years. Still, we came closest to a quantum computer when Google AI partnered with NASA in October 2019. They claimed to have performed calculations at a quantum level that is seemingly infeasible on any classical or supercomputer. But even this claim is questioned by many. Of course, the commercial and private use of quantum computing is a dream for the distant future, especially since exploiting the quantum properties of particles at the subatomic level, unlike the classical computing components we use, can only be possible in a controlled environment. However, in a decade or two, primitive quantum computing tools could feed into various research and simulations that will give us a closer look at atoms and molecular structures. This level of intricate insight and powerful calculations will help the medical and nutritional industries better understand the elements. Any industry or branch that relies on research and simulation would benefit greatly from this hyper-efficient computing technology. These include space exploration, manufacturing, engineering, molecular analysis, cryptography, chemical engineering, etc. Cybersecurity or encryption is another sector where quantum computing will break the norm and revolutionize it. Thanks to the quantum uncertainty of qubits, deciphering the encryption from a quantum computer would be nearly impossible.
<urn:uuid:c47ece87-d9e1-4608-8489-4c5d3514d1cf>
CC-MAIN-2023-06
https://lenoxledger.com/2022/09/17/inevitable-rise-or-distant-dream/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499826.71/warc/CC-MAIN-20230130165437-20230130195437-00289.warc.gz
en
0.943362
1,469
3.984375
4
Taking a Practical Step Forward in Optical Computing Using Slow Light: Photonic Crystals Offer a Slow Light Solution for Optical Computing Previously published on Apr 13, 2011 Quantum computing is the Mount Everest of the information technology revolution. What approach succeeds will almost assuredly utilize optical components. With the limits of traditional electronics threatening to halt progress, alternatives, such as optical computing, will be needed in the not so distant future. One major hurdle for the development of such optical systems has been the need to convert between optical and electronic signals. Because time spent converting optical data into an electronic format takes longer than simply using the traditional medium, the concept is impractical in many respects. On the other hand, an almost paradoxical concept known as slow light offers a way around this barrier with a very practical solution. It is a fundamental law of the universe that light can only exist at the speed of light. That is, photons must always move at approximately 300 million meters per second. Looking closely at this law reveals a rather obvious loophole. Light waves passing through almost any given medium usually take longer to propagate through said medium than they would free space, because the light is bent along a lengthier path due to the internal properties of the medium. In other words, photons will continue to move at light speed, but it takes them longer to navigate through an object rather than simply moving within a vacuum at light speed, i.e. light goes slower. Consequently, given the proper medium, light could be slowed to a crawl, or even stopped. It is how much a medium bends light that determines the "speed" of light and this property classically depends upon a material's index of refraction. A material with a high enough index of refraction, therefore, could be used to slow light. While the first demonstration of slow light in 1999, which yielded a speed around 17 meters per second, utilized a Bose-Einstein Condensate, which is a low-temperature state of matter where the atoms lose their individual characteristics and act almost as a single particle, one alternative approach is to utilize the many emerging manmade meta-materials that exhibit extreme properties, including super high indexes of refraction. On the other hand, researchers at the University of Sydney in New South Wales looked at advances in photonic crystals to suggest an even easier, more dynamic alternative. Photonic crystals are a rapidly advancing technology first developed in the 1990's. By engineering regular structures in an optical material, light will respond to the pattern as though it is passing through a crystal. Giving researchers far greater control over light, photonic crystals can be used to slow light to variable speeds at continually shrinking costs with greater precision and less bulk. In fact, Professor Benjamin Eggleton's research group has already demonstrated an approach using a photonic crystal structure engineered by a University of St. Andrews team led by Professor Thomas F. Krauss for use over a broad bandwidth yields a sixteen fold increase in processing speeds over a traditional silicon chip, or 640 gigabits a second. As such, it is obvious the next step forward is hybrid systems using photonic crystal chips. The key to processing and transmitting data stems from the ability to control how information flows. Light can get information to where it needs to go rather quickly, but the information must be stored until it can be used. Optical buffering as the "old fashion" approach relies on costly conversions between optical and electronic signals, so slowing light is a better option. If light is slowed or stopped until it is needed, a hybrid optical-electronic system would be extremely practical with results instantly surpassing the capacity of electronic devices. Consequently, we may soon see a major advancement in the telecommunications industry, followed by a renewed revolution in all computing technologies. Thanks to initiatives for promoting civil investments in solar energy, LED lighting, national security and so on, technologies based on research from the fields of optics have known great progress in recent years. Just as the fruits of this research finally start to ripen, however, public support is drying up due to budget battles in Europe and the United States. Meanwhile, private funding can often be very selective to our civilization's detriment as entrepreneurs only want to invest in products that guarantee them a return, especially in the current environment where high return, low cost business deals can be exploited by the investment community. The US was already significantly behind in providing funds for research while even less funding is certain to retard progress just as we are the verge of major advances on a number of fronts. With relatively low-cost experimental needs, the optical sciences offer solutions for everything from national and energy security to pharmaceutical and agricultural applications. Breakthroughs like slow light, meta-materials, photonic crystals, and quantum dots, which are essentially "traps" for photons and other particles, came about due to somewhat basic theories of some very complex subjects and scientists simply questioning. Not only do these discoveries and more have a myriad of potential applications, the costs associated with these technologies fall as we see progress while the benefits and profits begin to amass. Pursuing related research has already revealed some very meaningful discoveries and opportunities, but our society must be more aggressive in our pursuit of the basic research required to realize current and future gains.
<urn:uuid:bfae07dd-a401-4330-8b68-6ccef283f8f3>
CC-MAIN-2023-06
https://www.washingtonoutsider.org/taking-a-practical-step-forward-in-optical-computing-using-slow-light.html
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499816.79/warc/CC-MAIN-20230130101912-20230130131912-00049.warc.gz
en
0.942878
1,071
3.640625
4
|The diamond in the center measures 1mm by 1mm.| This problem has now been resolved. A paper just published in Nature claims that they have found a way to protect multiple qubits from decoherence over an extended period of time, and built a quantum computer into a diamond to prove it. Since I realize most of my readers don't keep up with this field as much as I do, I'll try my best to explain. How it works. Classical computers use bits (binary digits) to store information in memory. These are the binary digits (1s and 0s) that get stored in between calculation steps in any non-trivial program. If a 1 were to turn into 0 randomly in the middle of an operation, a program might still be able to recover if it has good error detection, but if huge numbers of bits were to switch from one to the other while a program was running, there's just no way it could continue to function as desired. Quantum computers use qubits instead of bits. This is what makes quantum computing as a concept so very powerful. Rather than use 1s and 0s that are either one or the other, the qubits of quantum computing utilize a superposition of states where a single qubit might be partly a 1 and partly a 0. Yet this reliance upon qubits has been a fundamental problem in quantum computing so far. It is just too easy for qubits to decohere and lose whatever information you try to store in it. It wasn't until 2008 that a group first figured out how to keep a qubit from losing its information for a grand total of 1.75 seconds. Not only is this amount of time too short to do anything with, but the process could not handle multiple qubits, making this type of quantum computer incapable of using more than one qubit at a time. Today's news marks the first time anyone has figured out how to shield multiple qubits from decohering for an extended period of time. I won't get into details of how they accomplish this; you can read the paper yourself for that. (Suffice it to say that they used microwave pulses to delay decoherence continuously.) But the point is that they were able to construct a working quantum computer and run a simple program on it to verify the qubits are not decohering. The fun part of the story. |Quantum circuit representation of Grover's algorithm.| The usual explanation of Grover's algorithm starts by imagining a phone book organized in alphabetical order. (We'll need a new mental example soon; I can't remember the last time I actually saw a phone book in person.) If we have a specific phone number we're looking for, our only real recourse is to search through the book one entry at a time. We might get lucky and find it as soon as we open phone book, but we also might be unlucky and not find it until the very last entry. (Technically the second to last, since we assume the number exists in the book, but please ignore this sentence for clarity.) In general, it turns out that, on average, we will find the number we seek on our N/2nd try, where N is the total number of entries in the phone book. For a book with four entries, we'll find it on our second try on average; if the book has a hundred entries, we'll find it on the fiftieth try on average. A phone book listing everyone in New York City would have ~8,000,000 entries; we'd find a particular entry on the four millionth try on average. Grover's algorithm, on the other hand, will find it, on average, on our O(N1/2) try. For a book of four entries, we'll basically find it on the first try every time. With a hundred entries, we'll find it on the tenth try on average. With 8,000,000 entries, on average, we'll find it on the 2,829 try. That's not a typo; it really is less than three thousand tries for an eight million entry database. This speed increase is enormous. The applications for such sheer speed has drastic repercussions in the real world. The team used Grover's algorithm on their quantum computer and found the correct entry on their first try 95% of the time. This puts the question of whether their computer works or not entirely to rest. Their quantum computer not only works, but works well enough to actually be capable of quantum calculation at a 5% error rate. Sure, that's not perfect, but it's already enough accuracy that, if one desired, error-correcting could be done. Just redo the calculation ten times; with a 95% accuracy rate, that is more than enough to determine the correct output. The insane speed increase quantum computers have over classical computers is enough to make it more than worthwhile to repeat the same calculation ten times in a row each time. Of course, the quantum computer they built held only two qubits, and so can only store so much information at a time. Doing Grover's algorithm on an 8,000,000 entry database, for example, would require seven qubits. (In general, O(log(N)) qubits would be required for a database of N entries.) But there is nothing stopping someone from creating such a quantum computer in principle, so long as they have enough microwave pulses to keep them all from decohering. The future, it appears, is now. So what does this all mean? First of all, it means that somebody has a working two qubit quantum computer right now. Realistically, this is not enough to cause much of a ruckus. To put it in context, this is only enough storage to solve Grover's algorithm for databases of seven entries or less. (Although Grover's original paper points out that his algorithm requires only that the processing be done with qubits; the memory can be saved in a classical bits.) However, quantum computers do not scale linearly like classical computers do. As mentioned previously, a mere seven qubits would be enough memory to store steps for Grover's algorithm on a database of 8,000,000 entries. It takes very few qubits to handle very large problems. Now that the conceptual hurdle has been passed, someone could, right this very moment, build a quantum computer with a dozen qubits. There is nothing preventing us from accomplishing such a feat now, save sheer expense. But if you can afford the dozen microwave lasers and have an appropriately flawed diamond to work with, it is certainly doable. What could one do with such a machine? To put it bluntly, nearly everything. |All bounded error, quantum, polynomial time (BQP)| problems are efficiently solved by quantum computers. This effectively breaks most commercial-grade encryption now in use on the internet. Some encryption does survive; notably lattice-based and McEliece cryptosystems. Although the last section makes it seem like quantum computers have now arrived, there are still problems that need to be addresses. David DiVincenzo points out that a practical quantum computer must have: - scalable physically to increase the number of qubits; - qubits can be initialized to arbitrary values; - quantum gates faster than decoherence time; - universal gate set; - easily read qubits. This new discovery solves the first (albeit expensively) and third issues completely. The second issue is still problematic, but is something that can be programmed around. The fifth issue is a matter of convenience; expense and repeatability makes this solvable with money alone. This leaves only the fourth issue: a universal gate set. As this is not yet solved, we will not yet be able to program whatever we want on a quantum computer. But we can still run Grover's algorithm, and a few other programs to which we know the necessary gates, and I've already shown how this affects society directly. A note on how the press is covering this. As a skeptic, I was very disappointed in a quote by Daniel Lidar on their method of delaying decoherence. He told press "it’s a little like time travel", because the microwave pulses that made the electron qubit switch their direction of rotation did so in a way that makes the decoherence disappear by moving back in the direction it came from. But, quite frankly, this is bullshit. That has no more to do with time travel than moving left does when you want to take back a move to the right. However, now that the quote is out there in a story that headlines with the word "quantum", you can be sure lots of quacks will completely misunderstand, as they so often do. A few credits. Although credit for stuff like this gets cited in journals, blogs rarely take the time to actually link out to the individual scientists' blogs/social media in articles like this. So I thought I'd buck the trend by giving a shout out to the authors of the study, including Daniel Lidar, Zhihui Wang, Toeno van der Sar, Machiel Blok, Hannes Bernien, Tim Taminiau, David Toyli, David Awschalom, & Ronald Hanson. Well done. Also, thanks to Wikipedia for its help in understanding basic principles and the University of South California for their press release summarizing the findings of the paper. And a tip of the hat to the Skeptic's Guide to the Universe for letting me know about this recent development in the first place. Your podcast is awesome. (c:
<urn:uuid:ca4e1558-739d-44a3-8ebe-a5373a4b86a6>
CC-MAIN-2023-06
https://www.ericherboso.org/2012/04/working-quantum-computer-now-running.html
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499953.47/warc/CC-MAIN-20230201211725-20230202001725-00209.warc.gz
en
0.953623
2,013
3.875
4
We always have interesting MSc/BSc/internship projects for enthusiastic students, show teachers cutting-edge experiments to teachers, etcetera, see opportunities! Light consists of single photons, and the most classical type of light, coherent laser light, actually consists of a large mix of so-called photon number states - packets containing a certain number of photons. This fact makes it very hard to transform laser light into single photons, which are an important resource for photonic quantum information from quantum key distribution to quantum computing. High-quality sources of single photons were only recently demonstrated by several groups including us. In this paper in PRL, we demonstrate that we can turn it around: based on such a high-quality source of single photons, and quantum interference in an optical setup, we are able to engineer artificial states of light that are very similar to coherent laser light, literally, we are making “light from scratch”! There are subtle differences between textbook coherent states and our artificial coherent states, for instance, our artificial light contains some quantum entanglement between photons at different times, in fact it contains so-called photonic cluster states that are also very useful for quantum information, see QLUSTER. In the figure we compare the Wigner functions where quantum entanglement shows up as negative values (the white curves indicate a value of zero). Apart from that, we see that the overlap with perfect coherent states is pretty good! We are happy to announce the start of the ERC-funded QLUSTER project, where we aim to use semiconductor quantum dots in micro-cavities to produce many quantum-entangled photons in form of cluster and graph states! Read more on https://qluster.eu All light that we see consists of photons, however, single photons itself show fascinating different properties that enable, for instance, 100% secure communications in quantum cryptography or superresolution in microscopy. But making single photons is not an easy task. In one approach, the “photon blockade” effect, a conventional laser beam is sent to an optical cavity with a single atom (or an artificial atom aka quantum dot). Quantum effects in this device make it possible that only one photon at a time exits the device, like a turnstile for single photons. This device, however, requires very special properties and is extremely hard to fabricate. Now, we have confirmed for the first time experimentally that this can be done much easier: by using the “unconventional photon blockade” effect. This was theoretically conceived by our co-authors Vincenzo Savona and Hugo Flayac from the EPFL Lausanne. In short, we exploit the polarization property of the photons, and by cleverly using quantum interference of different polarizations, we obtain the same as in the photon blockade: a nice stream of single photons. The “unconventional photon blockade” might be very useful for future single photon sources and gives insight in the exciting photon number dependent physics of such devices. Editors suggestion: A Double Take on Unconventional Photon Blockade", Article: Phys. Rev. Lett. 121, 043601 (2018), pre-print Arxiv:1803.10992. Cyril Vaneph and colleagues at the University of Paris-Sud have at the same time demonstrated the effect in the microwave regime. An ordered stream of single photons is fundamentally different from conventional light which contains bunches of a random number of photons. Sources of such single-photon light are essential for emerging quantum technologies such as quantum cryptography or computing, but widespread use of recently developed bright semiconductor quantum-dot based single-photon sources was hindered by the need for complex optical setups. Here we show a fiber-integrated source of high-quality single photons; integration with conventional optical fiber technology will enable broad use in quantum photonics but it also might enable a number of new fundamental studies in various fields from microscopy to quantum metrology by reducing the experimental complexity significantly. With a single semiconductor quantum dot in a polarization degenerate microcavity, operating in the weak coupling regime, we can transform incident coherent light into a stream of strongly correlated photons with a second-order correlation function g2(0) up to 40! This work is published in Nat. Commun. 7, 12578 (2016). See cqed for more details. See 4photonoam for more details. Polarization vortices are singular points in generic light fields, different but closely connected to phase vortices. Their singular nature makes them ideal as a positional marker of light beams, we investigated this by studying how they behave while being reflected at an interface. Apart from the scientific article here that is on the cover of issue 8, you can also find a labtalk article with a bit more background, and see here for more information. The spatial structure of light can be either pure or in an incoherent superposition, a statistical mixture. A laserpointer reflected from a rough surface shows “speckle”, and if you move the surface very quickly these speckle appear washed out; this is light with reduced spatial coherence. We studied how beam shifts depend on this by theory and experiment, click here for more information. “Spin-Orbit Interaction for Light and Matter Waves” is a workshop which we have organized at the MPIPKS in Dresden (germany) from 15.-19. April 2013. More information here. The workshop was a great success!
<urn:uuid:2fe0ac8c-de97-430a-a150-a487ec48018e>
CC-MAIN-2023-06
https://quphotonics.org/lab/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499826.71/warc/CC-MAIN-20230130165437-20230130195437-00290.warc.gz
en
0.9282
1,144
3.640625
4
What is Advanced Materials Science and Technology Playing text to speech You’ve heard of materials science and engineering, but what are advanced materials science and technology? Advanced materials science is the study of the development, characterization, and production of next-generation materials. These are materials that have been designed to have improved properties or performance compared to existing materials. The field of advanced materials science covers a wide range of topics, from nanotechnology to quantum computing. In this blog post, we will explore what advanced materials science is and some of the different technologies involved in this cutting-edge field. What are Advanced Materials? Advanced materials are those that exhibit superior properties compared to conventional materials. They often have unique combinations of physical and chemical properties that make them suitable for a wide range of applications. There are many different types of advanced materials, including: • Ceramics: These are inorganic materials that have been used for centuries in a variety of applications, from pottery to engineering. Today, ceramics are used in everything from medical implants to solar panels. • Composite materials: These are made by combining two or more dissimilar materials to create a material with improved properties. For example, fiber-reinforced composites are strong and lightweight, making them ideal for use in aircraft and sporting equipment. • Nanomaterials: These are materials that measure just a few nanometers (billionths of a meter) in size. Due to their small size, nanomaterials often have unique physical and chemical properties that make them useful in a range of applications, from sensors to computer chips. • Metals: Metals have been used by humans for thousands of years and continue to play an important role in society today. In recent years, advances in metallurgy have led to the development of new types of metals with improved properties, such as corrosion resistance and strength. Properties of Advanced Materials There are many different types of advanced materials, each with their own unique properties. Some common examples include: -Graphene: A one-atom-thick layer of carbon that is incredibly strong and flexible. -Nanocrystalline materials: Materials with extremely small grains that have enhanced strength and toughness. -Metamaterials: Engineered materials with unusual properties, such as the ability to bend light in unprecedented ways. -Shape memory alloys: Alloys that can be deformed at high temperatures and then returned to their original shape upon cooling. Each of these materials has the potential to revolutionize various industries, from construction and transportation to electronics and computing. Classification of Advanced Materials - There are three primary categories of advanced materials: metals, polymers, and ceramics. Each of these categories can be further subdivided into subcategories. - Metals: Metallic alloys are created by combining two or more metallic elements. They can be categorized based on their microstructure, which is the way the atoms are arranged within the metal. Common types of microstructures include grain boundary alloys, nanocrystalline alloys, and amorphous alloys. - Polymers: Polymers are long chains of molecules that can be classified based on their chemical structure. Common types of polymers include thermoplastic polymers, thermosetting polymers, elastomers, and biopolymers. - Ceramics: Ceramics are inorganic materials that are made up of non-metallic elements. They can be either crystalline or non-crystalline in nature. Common types of ceramics include oxide ceramics, nitride ceramics, carbide ceramics, and halide ceramics. Some examples of Advanced Materials - Advanced materials science and technology is a relatively new field that encompasses a wide range of material types, including nanomaterials, quantum materials, metallic glasses, and more. Researchers in this field are working to develop new materials with improved properties and performance characteristics. - One example of an advanced material is carbon nanotubes. Carbon nanotubes are extremely strong and lightweight, making them ideal for use in a variety of applications, including energy storage, construction, and manufacturing. Another example is graphene, a one-atom-thick layer of carbon that is extremely strong and conductive. Graphene has potential applications in electronics, sensors, batteries, and more. - Researchers are also working on developing new methods for 3D printing with advanced materials. This technology could be used to create customized parts and products with complex shapes and structures. Additionally, 3D printing with advanced materials could be used to create scaffolds for tissue regeneration or to fabricate medical devices. How is Advanced Materials Science and Technology used? - Advanced materials science and technology is used to develop new materials with improved properties. These improved properties can be due to the material’s composition, structure, or both. For example, advanced materials can be developed to be stronger and lighter than traditional materials, more heat resistant, more chemically resistant, or have other improved performance characteristics. - Developing new advanced materials requires a deep understanding of the relationships between the material’s composition, structure, and properties. This understanding is typically gained through research at the atomic and molecular level. Once this understanding is attained, scientists and engineers can use this knowledge to design and synthesize new materials with specific desired properties. - The development of advanced materials is an important area of research and development as these new materials can enable advances in many different fields. For example, stronger and lighter materials can lead to advances in transportation; more heat resistant materials can enable advances in energy production; and more chemically resistant materials can facilitate advances in environmental cleanup technologies. Future of Advanced Materials Science and Technology - Advanced materials science and technology is an interdisciplinary field that applies the properties of matter to create new materials with superior performance. It is a rapidly growing field with immense potential for transforming the way we live and work. - The future of advanced materials science and technology is immensely bright. The field is constantly evolving, and new breakthroughs are being made all the time. We are on the cusp of major advances in many different areas, from developing stronger, lighter and more sustainable materials to creating new medical technologies and improving energy storage. - There are endless possibilities for what we can achieve with advanced materials science and technology. In the coming years, we will continue to push the boundaries of what is possible, making everyday items smarter, stronger and more sustainable. We will also develop new technologies that have the potential to change the world, from medical treatments that can save lives to energy sources that are cleaner and more efficient. Advanced materials science and technology is an exciting and growing field that promises to revolutionize the way we live and work. From stronger and lighter building materials to more efficient solar cells, advanced materials have the potential to transform our world. If you're interested in learning more about this field, be sure to check out our website for more information. Thanks for reading!
<urn:uuid:0d16cefd-9a20-48c7-b53d-d3477064cb6f>
CC-MAIN-2023-06
https://yourviews.mindstick.com/view/84331/what-is-advanced-materials-science-and-technology
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499829.29/warc/CC-MAIN-20230130201044-20230130231044-00410.warc.gz
en
0.929498
1,445
3.734375
4
Quantum Machine Learning: An Overview Quantum Machine Learning (Quantum ML) is the interdisciplinary area combining Quantum Physics and Machine Learning(ML). It is a symbiotic association- leveraging the power of Quantum Computing to produce quantum versions of ML algorithms, and applying classical ML algorithms to analyze quantum systems. Read this article for an introduction to Quantum ML. At a recent conference in 2017, Microsoft CEO Satya Nadella used the analogy of a corn maze to explain the difference in approach between a classical computer and a quantum computer. In trying to find a path through the maze, a classical computer would start down a path, hit an obstruction, backtrack; start again, hit another obstruction, backtrack again until it ran out of options. Although an answer can be found, this approach could be very time-consuming. In contrast, quantum computers “unlock amazing parallelism. They take every path in the corn maze simultaneously.” Thus, leading to an exponential reduction in the number of steps required to solve a problem. The parallelism comes from the concept of ‘qubit’, 'superposition' and 'entanglement' derived from Quantum Physics. I. Quantum Computing: Quantum is the smallest possible unit of any physical entity, such as energy or mass. In 1900, Max Planck proposed that, at the atomic and subatomic level, energy of a body is contained in discrete packets called quanta’. Wave-particle duality is the characteristic of quantic particles to behave as a wave sometimes and as a particle the other times, depending on the environment. Quantum theory is characterized by finding the probability of, and not the exact location of, a particle at a given point x in space. Fig 1: The dual nature of light, which acts like both particles and waves. (Source) A classical computer performs operations using classical ‘bits’, which are either 0 OR 1. However, a quantum computer uses quantum bits, also called ‘qubits’ to perform operations. Qubits can be represented by: - An electron orbiting a nucleus: where |1> and |0> are the excited state and ground state respectively - A photon: where |1> and |0> are polarizations of the photon. Qubits exist as both 0 AND 1 at the same time. This phenomenon is called ‘superposition’. Although a particle can exist in multiple quantum states, once we measure that particle for its energy or position, its superposition is lost and it then exists in only one state. Fig2 : The qubit is defined as a pair of complex vectors pointing to a spot on a unit sphere. Traditionally, a qubit pointing directly up (positive on the axis) is denoted as the column vector |0⟩ and the vector pointing down is known as |1⟩. (For example, in this case, the qubit is |0⟩). ‘Quantum entanglement’ is the phenomenon in which quantum particles interact with each other and are described with reference to each other, not independently, even if the particles are separated by a large distance. At the time of measurement, if one entangled particle in a pair is decided to be in the spin state of ‘down’ (that is, the lowest energy state; when the electron is in alignment with its magnetic field), then this decision is communicated to the other correlated particle that now assumes the opposite spin state of ‘up’. Quantum entanglement allows qubits, including those faraway, to interact instantaneously with each other. How does Quantum computing unlock immense parallelism? Two interacting classical bits can take one of 4 forms: 00 or 01 or 10 or 11. Each of these 2 components of information- the first bit and the second bit, combine to represent only one binary configuration at a given time. Adding more bits to a regular computer would still represent a single binary configuration. Fig3: One qubit in superposition before measurement, with its probabilities of ‘spin-up’ AND ‘spin-down'. (Source) One qubit can exist in both states (0 AND 1) at once. Thus, two interacting qubits can store all 4 binary configurations simultaneously. In general, ‘n’ qubits can simultaneously represent ‘2^n’ classical binary configurations. Thus, a 300–qubit quantum computer can explore 2^300 possible solutions at the same time, unlike 1 solution at a time in a classical computer, causing immense parallelism. Adding more qubits to a quantum computer would exponentially increase the power of the computer. A fully quantum computer has not yet been realized because adding more qubits and dealing with subatomic particles that require a low temperature of -452 F in order to be stable, is daunting and building a computer around that (a ‘quantum computer’), even more so. Thus, efforts are on to ‘simulate’ 40 qubit operations using Microsoft’s quantum simulator- LIQUi|> , extended by Microsoft Azure’s cloud computing resources. Quantum Computing can solve specialized scientific problems such as molecular modelling, creation of high-temperature superconductors, drug modelling and testing, selection of molecules for the creation of organic batteries. It is not optimal for general-purpose tasks such as for watching videos or writing a Word document. Now, how does Quantum Computing fit in with Machine Learning? II. Quantum ML: 2a) Quantum versions of ML algorithms - Finding eigenvalues and eigenvectors of large matrices: One of the methods to perform the classical PCA algorithm is to take the eigenvalue decomposition of a data covariance matrix. However, this is not so efficient in case of high dimensional data. Quantum PCA of an unknown low-rank density matrix, can reveal the quantum eigenvectors associated with the large eigenvalues, exponentially faster than a linearly-scaled classical algorithm. - Finding nearest neighbours on a quantum computer: The quantum algorithms presented here for computing nearest neighbours that are used in supervised and unsupervised learning, place an upper bound on the number of queries to the input data required to compute distance metrics such as the Euclidean distance and inner product. The best cases show exponential and super-exponential reductions in query complexity and the worst case still shows polynomial reduction in query complexity over the classical analogue.
<urn:uuid:b8be3abe-8217-4343-a15b-5eccc24f6938>
CC-MAIN-2023-06
https://www.kdnuggets.com/2018/01/quantum-machine-learning-overview.html
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499710.49/warc/CC-MAIN-20230129080341-20230129110341-00092.warc.gz
en
0.911218
1,386
3.5625
4
Researchers have taken an important step toward the long-sought goal of a quantum computer, which in theory should be capable of vastly faster computations than conventional computers, for certain kinds of problems. The new work shows that collections of ultracold molecules can retain the information stored in them, for hundreds of times longer than researchers have previously achieved in these materials. These two-atom molecules are made of sodium and potassium and were cooled to temperatures just a few ten-millionths of a degree above absolute zero (measured in hundreds of nanokelvins, or nK). The results are described in a report this week in Science, by Martin Zwierlein, an MIT professor of physics and a principal investigator in MIT’s Research Laboratory of Electronics; Jee Woo Park, a former MIT graduate student; Sebastian Will, a former research scientist at MIT and now an assistant professor at Columbia University, and two others, all at the MIT-Harvard Center for Ultracold Atoms. Many different approaches are being studied as possible ways of creating qubits, the basic building blocks of long-theorized but not yet fully realized quantum computers. Researchers have tried using superconducting materials, ions held in ion traps, or individual neutral atoms, as well as molecules of varying complexity. The new approach uses a cluster of very simple molecules made of just two atoms. Using this kind of two-atom molecules for quantum information processing “had been suggested some time ago,” says Park, “and this work demonstrates the first experimental step toward realizing this new platform, which is that quantum information can be stored in dipolar molecules for extended times.” This vacuum chamber with apertures for several laser beams was used to cool molecules of sodium-potassium down to temperatures of a few hundred nanoKelvins, or billionths of a degree above absolute zero. Such molecules could be used as a new kind of qubit, a building block for eventual quantum computers. Courtesy of the researchers “The most amazing thing is that [these] molecules are a system which may allow realizing both storage and processing of quantum information, using the very same physical system,” Will says. “That is actually a pretty rare feature that is not typical at all among the qubit systems that are mostly considered today.” In the team’s initial proof-of-principle lab tests, a few thousand of the simple molecules were contained in a microscopic puff of gas, trapped at the intersection of two laser beams and cooled to ultracold temperatures of about 300 nanokelvins. “The more atoms you have in a molecule the harder it gets to cool them,” Zwierlein says, so they chose this simple two-atom structure. The molecules have three key characteristics: rotation, vibration, and the spin direction of the nuclei of the two individual atoms. For these experiments, the researchers got the molecules under perfect control in terms of all three characteristics — that is, into the lowest state of vibration, rotation, and nuclear spin alignment. “We have strong hopes that we can do one so-called gate — that’s an operation between two of these qubits, like addition, subtraction, or that sort of equivalent — in a fraction of a millisecond,” Zwierlein says. “If you look at the ratio, you could hope to do 10,000 to 100,000 gate operations in the time that we have the coherence in the sample. That has been stated as one of the requirements for a quantum computer, to have that sort of ratio of gate operations to coherence times.” “The next great goal will be to ‘talk’ to individual molecules. Then we are really talking quantum information,” Will says. “If we can trap one molecule, we can trap two. And then we can think about implementing a ‘quantum gate operation’ — an elementary calculation — between two molecular qubits that sit next to each other,” he says. Using an array of perhaps 1,000 such molecules, Zwierlein says, would make it possible to carry out calculations so complex that no existing computer could even begin to check the possibilities. Though he stresses that this is still an early step and that such computers could be a decade or more away, in principle such a device could quickly solve currently intractable problems such as factoring very large numbers — a process whose difficulty forms the basis of today’s best encryption systems for financial transactions. Besides quantum computing, the new system also offers the potential for a new way of carrying out precision measurements and quantum chemistry, Zwierlein says. Extending the coherence time of molecules Quantum properties of atoms and molecules can be exploited for precision measurements or quantum information processing. The complex state structure of molecules can be exploited, but it is hard to preserve the coherence between pairs of those states in applications. Park et al. created fermionic molecules of NaK in the rovibrational ground state that maintained coherence between their nuclear spin states on a time scale of 1 second. This long coherence time makes dipolar ultracold molecules a valuable quantum resource. Coherence, the stability of the relative phase between quantum states, is central to quantum mechanics and its applications. For ultracold dipolar molecules at sub-microkelvin temperatures, internal states with robust coherence are predicted to offer rich prospects for quantum many-body physics and quantum information processing. We report the observation of stable coherence between nuclear spin states of ultracold fermionic sodium-potassium (NaK) molecules in the singlet rovibrational ground state. Ramsey spectroscopy reveals coherence times on the scale of 1 second; this enables high-resolution spectroscopy of the molecular gas. Collisional shifts are shown to be absent down to the 100-millihertz level. This work opens the door to the use of molecules as a versatile quantum memory and for precision measurements on dipolar quantum matter. Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technology and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology. Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels. A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.
<urn:uuid:a2021157-c22f-4502-8bb8-41819c44264e>
CC-MAIN-2023-06
https://www.nextbigfuture.com/2017/07/atoms-cooled-to-300-nanokelvin-could-enable-powerful-quantum-computers.html
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500837.65/warc/CC-MAIN-20230208155417-20230208185417-00216.warc.gz
en
0.933012
1,414
3.9375
4
The speed of light is the rate at which light travels. The speed of light in a vacuum is a constant value that is denoted by the letter c and is defined as exactly 299,792,458 meters per second. Visible light, other electromagnetic radiation, gravity waves, and other massless particles travel at c. Matter, which has mass, can approach the speed of light, but never reach it. Value for the Speed of Light in Different Units Here are values for the speed of light in various units: - 299,792,458 meters per second (exact number) - 299,792 kilometers per second (rounded) - 3×108 m/s (rounded) - 186,000 miles per second (rounded) - 671,000,000 miles per hour (rounded) - 1,080,000,000 kilometers per hour (rounded) Is the Speed of Light Really Constant? The speed of light in a vacuum is a constant. However, scientists are exploring whether the speed of light has changed over time. Also, the rate at which light travels changes as it passes through a medium. The index of refraction describes this change. For example, the index of refraction of water is 1.333, which means light travels 1.333 times slower in water than in a vacuum. The index of refraction of a diamond is 2.417. A diamond slows the speed of light by more than half its speed in a vacuum. How to Measure the Speed of Light One way of measuring the speed of light uses great distances, such as distant points on the Earth or known distances between the Earth and astronomical objects. For example, you can measure the speed of light by measuring the time it takes for light to travel from a light source to a distant mirror and back again. The other way of measuring the speed of light is solving for c in equations. Now that the speed of light is defined, it is fixed rather than measured. Measuring the speed of light today indirectly measures the length of the meter, rather than c. In 1676, Danish astronomer Ole Rømer discovered light travels at a speed by studying the movement of Jupiter’s moon Io. Prior to this, it seemed light propagated instantaneously. For example, you see a lightning strike immediately, but don’t hear thunder until after the event. So, Rømer’s finding showed light takes time to travel, but scientists did not know the speed of light or whether it was constant. In 1865, James Clerk Maxwell proposed that light was an electromagnetic wave that travelled at a speed c. Albert Einstein suggested c was a constant and that it did not change according to the frame of reference of the observer or any motion of a light source. In other words, Einstein suggested the speed of light is invariant. Since then, numerous experiments have verified the invariance of c. Is It Possible to Go Faster Than Light? The upper speed limit for massless particles is c. Objects that have mass cannot travel at the speed of light or exceed it. Among other reasons, traveling at c gives an object a length of zero and infinite mass. Accelerating a mass to the speed of light requires infinite energy. Furthermore, energy, signals, and individual photos cannot travel faster than c. At first glance, quantum entanglement appears to transmit information faster than c. When two particles are entangled, changing the state of one particle instantaneously determines the state of the other particle, regardless of the distance between them. But, information cannot be transmitted instantaneously (faster than c) because it isn’t possible to control the initial quantum state of the particle when it is observed. However, faster-than-light speeds appear in physics. For example, the phase velocity of x-rays through glass often exceeds c. However, the information isn’t conveyed by the waves faster than the speed of light. Distant galaxies appear to move away from Earth faster than the speed of light (outside a distance called the Hubble sphere), but the motion isn’t due to the galaxies traveling through space. Instead, space itself it expanding. So again, no actual movement faster than c occurs. While it isn’t possible to go faster than the speed of light, it doesn’t necessarily mean warp drive or other faster-than-light travel is impossible. The key to going faster than the speed of light is to change space-time. Ways this might happen include tunneling using wormholes or stretching space-time into a “warp bubble” around a spacecraft. But, so far these theories don’t have practical applications. - Brillouin, L. (1960). Wave Propagation and Group Velocity. Academic Press. - Ellis, G.F.R.; Uzan, J.-P. (2005). “‘c’ is the speed of light, isn’t it?”. American Journal of Physics. 73 (3): 240–27. doi:10.1119/1.1819929 - Helmcke, J.; Riehle, F. (2001). “Physics behind the definition of the meter”. In Quinn, T.J.; Leschiutta, S.; Tavella, P. (eds.). Recent advances in metrology and fundamental constants. IOS Press. p. 453. ISBN 978-1-58603-167-1. - Newcomb, S. (1886). “The Velocity of Light”. Nature. 34 (863): 29–32. doi:10.1038/034029c0 - Uzan, J.-P. (2003). “The fundamental constants and their variation: observational status and theoretical motivations”. Reviews of Modern Physics. 75 (2): 403. doi:10.1103/RevModPhys.75.403
<urn:uuid:bef4891e-23a1-42bc-8257-15b06d7f2be8>
CC-MAIN-2023-06
https://sciencenotes.org/what-is-the-speed-of-light/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500339.37/warc/CC-MAIN-20230206113934-20230206143934-00295.warc.gz
en
0.897855
1,236
3.828125
4
Emerging technologies are innovative technical developments that have the potential to change the way we live and work. Some examples of emerging technologies include artificial intelligence, the Internet of Things, virtual and augmented reality, blockchain, gene editing, robotics, and quantum computing. These technologies are still in the early stages of development, but have the potential to revolutionize industries and change the world in significant ways. It is important to consider both the potential benefits and risks of emerging technologies as they continue to evolve and become more widely adopted. Emerging technologies are those technical innovations that represent progressive developments within a field for competitive advantage. These technologies are generally in the early stages of development and are not yet mature enough for commercial use, but have the potential to significantly change the way we live and work. One emerging technology that has garnered a lot of attention in recent years is artificial intelligence (AI). AI refers to the ability of machines to perform tasks that would normally require human-level intelligence, such as learning, problem-solving, and decision-making. There are many potential applications for AI, including automating repetitive and time-consuming tasks, improving decision-making processes, and enabling the development of new products and services. Another emerging technology is the Internet of Things (IoT), which refers to the interconnectedness of physical objects through the internet. This technology allows objects to be connected and controlled remotely, enabling them to send and receive data and perform actions based on that data. The IoT has the potential to revolutionize the way we live and work, with applications ranging from smart homes and cities to connected factories and agriculture. Virtual and augmented reality Virtual and augmented reality (VR and AR) are also emerging technologies that are gaining traction. VR involves the use of computer technology to create a simulated environment, while AR involves superimposing computer-generated images onto the real world. These technologies have a wide range of applications, including entertainment, education, and training. For example, VR and AR can be used to create immersive experiences, such as virtual field trips or simulated training scenarios. Blockchain technology is another emerging technology that has generated a lot of buzz in recent years. A blockchain is a decentralized, distributed ledger that records transactions on multiple computers, making it virtually impossible to alter or hack. This technology has the potential to revolutionize many industries, including finance, healthcare, and supply chain management. There are many other emerging technologies that have the potential to shape the future, including gene editing, robotics, and quantum computing. Gene editing involves making precise changes to the DNA of living organisms, with the potential to cure diseases, improve crop yields, and more. Robotics refers to the use of machines to perform tasks that would normally require human labor, and has applications in manufacturing, transportation, and more. Quantum computing involves the use of quantum-mechanical phenomena, such as superposition and entanglement, to perform calculations that would be impossible for classical computers. This technology has the potential to solve problems that would take classical computers years to solve in just minutes. Artificial intelligence (AI) refers to the ability of machines to perform tasks that would normally require human-level intelligence, such as learning, problem-solving, and decision-making. There are many potential applications for AI, including automating repetitive and time-consuming tasks, improving decision-making processes, and enabling the development of new products and services.AI can be classified into two main categories: narrow or general. Narrow AI is designed to perform a specific task, such as facial recognition or language translation. It is often referred to as "weak AI." General AI, on the other hand, is designed to be able to perform any intellectual task that a human can. It is often referred to as "strong AI." While narrow AI is currently more prevalent, the ultimate goal of AI research is to develop general AI. There are several approaches to developing AI, including machine learning and deep learning. Machine learning involves training algorithms on a large dataset and allowing them to improve their performance over time, without being explicitly programmed. Deep learning is a subset of machine learning that involves the use of artificial neural networks to learn and make decisions. AI has the potential to revolutionize many industries, including healthcare, information technology, transportation, finance, and manufacturing. It can also have a significant impact on society as a whole, both positive and negative. On the positive side, AI can be used to solve global challenges such as poverty, disease, and climate change. On the negative side, it could lead to job displacement and social and economic inequality. The development and use of AI raise ethical concerns, including issues around privacy, bias, and accountability. It is important for researchers and policymakers to consider these issues and work to address them as AI continues to evolve and become more prevalent in our lives. In conclusion, companies like Scrrum Labs have emerging technologies that have the potential to significantly impact the way we live and work. These technologies are still in the early stages of development and have not yet reached maturity, but they have the potential to revolutionize industries and change the world in ways we can't yet imagine.
<urn:uuid:6dfd1a05-c047-43a9-9a8b-4a05c46eb986>
CC-MAIN-2023-06
https://scrrum.com/blog/look-at-the-most-promising-emerging-technologies
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499758.83/warc/CC-MAIN-20230129180008-20230129210008-00457.warc.gz
en
0.956908
1,048
3.515625
4
Updated on January 16, 2023 1:42 PM Encryption standards play a vital role in securing blockchain ecosystems from malware and unusual authentication or attacks. Generally, Cryptographic algorithms are used to protect the data within a blockchain which only could be accessed via keys. These keys can be publicly shared or even act as private keys. In this section of basics, we are going to learn about Symmetric-key cryptography which is an algorithm for Blockchain encryption. Let’s dive deep into this. Symmetric-key Cryptography is a kind of encryption that uses a single key (a secret key) to encode and decode electronic data. The key must be exchanged between the organizations communicating using symmetric encryption so that it may be utilized in the decryption process. This encryption method is distinct from asymmetric encryption, which employs a pair of keys, one public and one private, to encrypt and decode messages. For example, we all have used Google docs (if not then use it once). The content we put on our doc file is only visible to us until we disable the sharing restrictions from the top right bar. When we opt for the “Share with anyone” feature, we get a link which then could be shared with a reliable person whom you wanted to get access to your doc file to see the content. Suppose the doc file is “Data or information” and the “link” is the key. This suffices the example of Symmetric-key cryptography. Encoding data in this manner has been widely utilized in earlier decades to permit covert communication between governments and armies. Symmetric-key cryptography is also known as shared-key cryptography, secret-key cryptography, single-key cryptography, one-key cryptography, and finally private-key cryptography. With this type of encryption, it is obvious that the key must be known by both the sender and the recipient. The distribution of the key is the source of the approach's intricacy. Symmetric encryption technique ”scrambles” the data so that it cannot be understood by anybody who does not have the secret key to decode it. Once the intended receiver who holds the key obtains the message, the algorithm reverses its activity such that the message is restored to its original readable form. The secret key used by both the sender and the receiver might be a specific password/code or a random string of letters or numbers created using a secure random number generator (RNG). Source: by David McNeal (TheCryptoWriter) | Medium The above diagram could be concluded as: The sender encrypts their information with an encryption key (often a string of letters and numbers). The encrypted information, known as ciphertext, appears as jumbled letters and is unreadable by anybody along the road. The decryption key is used by the receiver to convert the ciphertext back into readable text. The previous example shows that the information was accessed using the same key. The data can only be viewed and accessed by these two parties (sender and recipient). This is why it is also known as secret key cryptography, private key cryptography, symmetric cryptography, and symmetric key encryption. The use of a single key for both encryption and decryption streamlines the encryption process. After all, you're using a single key to convert readable plaintext into unreadable gibberish (ciphertext) and vice versa. One benefit of employing symmetric encryption is that it enables data privacy and confidentiality without the additional complexity of many keys. For some applications, symmetric key encryption works on its own. It's handy for encrypting databases and files, for example, when no data is being sent openly between parties. Symmetric Key cryptography is classified into two types: Let’s understand both separately. Block algorithms serve to secure electronic data blocks. While using the authorized private key, predefined set lengths of bits are changed. This key is then applied to every block. When encrypting network stream data, the encryption system stores the data in its memory components while waiting for the whole blocks. The amount of time the system waits can create a security hole and compromise data security and integrity. The solution to the problem comprises a method in which the data block can be lowered and merged with the contents of the preceding encrypted data block until the rest of the blocks emerge. This is referred to as feedback. Only once the complete block has been received it is then marked as encrypted. Stream algorithms are not stored in the memory of the encryption system but instead come in data stream algorithms. This approach is deemed slightly safer since a disk or system does not preserve data without encryption in the blocks. Some examples of Symmetric-Key Cryptography: AES (Advanced Encryption Standard) DES (Data Encryption Standard) IDEA (International Data Encryption Algorithm) Blowfish (Drop-in replacement for DES or IDEA) RC4 (Rivest Cipher 4) RC5 (Rivest Cipher 5) RC6 (Rivest Cipher 6) AES, DES, IDEA, Blowfish, RC5 and RC6 are block ciphers. RC4 is a stream cipher. Despite being an older kind of encryption, symmetric encryption is quicker and more effective than asymmetric encryption, which strains networks owing to performance concerns with data quantity and high CPU usage. Symmetric cryptography is generally used for bulk encryption / encrypting huge volumes of data, such as for database encryption, due to its greater performance and faster speed (relative to asymmetric encryption). The secret key could only be accessible to the database itself to encrypt or decrypt in the event of a database. Comparing existing standards for asymmetric algorithms to industry-standard symmetric encryption, one can see that the latter is less susceptible to developments in quantum computing (at the time of writing). Here are some instances in which symmetric cryptography is applied: Applications for making payments, like card transactions, require the protection of PII to stop identity theft and fraudulent charges. validations that a message's sender is who they say they are. Hashing or the random number generation. In addition to encryption, symmetric ciphers are frequently employed to accomplish other cryptographic primitives. A message cannot be guaranteed to remain intact while being encrypted. As a result, a message authentication code is frequently appended to a ciphertext to make sure that the recipient will be aware of modifications to the ciphertext. From an AEAD cipher, message authentication codes may be created (e.g. AES-GCM). However, without involving extra parties, symmetric ciphers cannot be employed for non-repudiation (the assurance that someone cannot deny the validity of something) reasons. Symmetric encryption uses a single key that must be shared with the individuals who need to receive the message. While, Asymmetrical encryption involves a pair, consisting of a public key and a private key, to encrypt and decode messages while communicating An old concept Relatively new concept In symmetric cryptography, data is encrypted or decrypted using a single shared key that both parties are aware of. Asymmetric encryption is developed to address the issue of the symmetric encryption model's requirement for key exchange by replacing the key with a pair of public-private keys. The execution time is much faster Slower execution time. The most recent technologies could sometimes be the ideal fit when it comes to encryption. In reality, as cryptography develops in a new way, new protocols are being built to keep up with would-be hackers and to safeguard the data to increase privacy. In the upcoming years, hackers will inevitably make things difficult for specialists, thus we can confidently anticipate new advancements from the cryptography community.
<urn:uuid:8c57f6fe-4d39-4a23-8d50-9b26d10335e0>
CC-MAIN-2023-06
https://themorningcrypto.com/article/symmetric-key-cryptography
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500151.93/warc/CC-MAIN-20230204173912-20230204203912-00737.warc.gz
en
0.926628
1,674
3.78125
4
On the 4th November 1964, the physicist John S. Bell published a paper called On the Einstein-Podolsky-Rosen paradox. This was an important paper for both philosophy and physics with implications for our understanding of reality and freedom. When quantum theory was developed in the early 20th century, the philosophical implications troubled some, including Einstein. The “Copenhagen interpretation” put realism in science under threat. Although the “macro” world (people, planets, plates and platypuses) were argued to be real existing things, electrons and other particles were held not to be. The world was therefore divided into the “classical” and the “quantum” worlds, or as John S. Bell later called them, the “speakable” and the “unspeakable”. In 1935, Einstein published a paper with Nathan Rosen and Boris Podolsky (known collectively as EPR) arguing that quantum mechanics was not a complete theory, but required additional “hidden” variables to preserve realism and locality. “In the vernacular of Einstein: locality meant no instantaneous (“spooky”) action at a distance; realism meant the moon is there even when not being observed.” (wiki) Bell also argued for realism, thus rejecting the Copenhagen Interpretation. He worked with realist theories such as de Broglie–Bohm theory, but the theory violated the EPR locality criterion. This fact was used to argue that it was on the wrong track, but Bell’s 1964 paper showed that “any serious version of quantum theory (regardless of whether or not it is based on microscopic realism) must violate locality. This means that if nature is governed by the predictions of quantum theory, the ‘locality principle’ is simply wrong, and our world is nonlocal” (American Scientist) Experiments have since been carried out demonstrating that nature does indeed follow the predictions of quantum theory in the required way. The “conclusion that there are hidden variables implies that, in some spin-correlation experiments, the measured quantum mechanical probabilities should satisfy particular inequalities (Bell-type inequalities). The paradox consists in the fact that quantum probabilities do not satisfy these inequalities. And this paradoxical fact has been confirmed by several laboratory experiments since the 1970s” (IEP). Thus Bell converted the EPR thought experiment into real experiments, albeit with results that Einstein would have disliked. It suggests that any quantum theory we arrive at will conflict with common sense. (It also has technical implications for technical advances such as quantum cryptography and quantum computing.) Later, Bell suggested a hypothesis which would resolve the “spooky action” problem without requiring faster-than-light information transfer: super-determinism. Super-determinism states “[t]hat not only is inanimate nature deterministic, but we, the experimenters who imagine we can choose to do one experiment rather than another, are also determined. If so, the difficulty which this experimental result creates disappears” (from The Ghost in the Atom, P.C.W. Davies and J. Brown, ch.3, p.47, quoted here) – in other words, free will is an illusion. Bell demonstrated that philosophy and physics can usefully interact. In the words of Tim Maudlin (“on the foundations of physics” in 3:am): In my view, the greatest philosopher of physics in the first half of the 20th century was Einstein and in the second half was John Stewart Bell. So physicists who say that professional philosophers have not made the greatest contributions to foundations of physics are correct. But both Einstein and Bell had philosophical temperaments, and Einstein explicitly complained about physicists who had no grounding in philosophy. The community of people who work in foundations of physics is about evenly divided between members of philosophy departments, members of physics departments and members of math departments. […] A more salient division in contemporary foundations is between those, like myself, who judge that Bell was basically correct in almost everything he wrote and those who think that his theorem does not show much of interest and his complaints about the unprofessional vagueness that infects quantum theory are misplaced. Bell’s essay “Against ‘measurement'” lists “system, apparatus, environment, microscopic, macroscopic, reversible, irreversible, observable, information, measurement” as terms ubiquitous in quantum theory that cannot be defined precisely. Without precision, concepts and theory cannot hope to be precise either. John S. Bell died of a stroke in 1990. At the time of his death he was widely believed to be a front runner for the Nobel Prize in Physics. Surprisingly, his abilities in physics were almost lost. Born in Belfast in 1928, he failed to win a scholarship to grammar school and left school at 16. It was when he was working as a laboratory assistant in Queen’s University that his talent was spotted by Professors Karl Emelaus and Robert Sloane, who encouraged him to attend first-year lectures. Bell then enrolled, obtaining two first-class honours degrees at Queen’s followed by a PhD in the University of Birmingham. He then worked at the UK Atomic Energy Research Establishment, before moving to CERN. Professor Mary Daly, President of the Royal Irish Academy said ‘The Academy wants John Bell to be the best known scientist in Northern Ireland and to be acknowledged as one of the most important scientists in the world’. A number of events are planned for John Bell Day – see the RIA for more information. Michael Nauenberg, John Bell’s Major Contribution to Physics and Philosophy, RIA. Help Wanted: Philosopher required to sort out Reality – Philosophy Now. A great straightforward overview, but available to subscribers only. Tim Maudlin, PBS: Why Physics Needs Philosophy Irish Times: Ireland’s rich history in science deserves acclaim. Opinion: time to give John Bell the recognition he deserves. Scientia Salon: Quantum mechanics and scientific realism – on the difficulties of creating a realist theory of quantum mechanics. — CERN (@CERN) November 4, 2014 — Queen's University (@QueensUBelfast) November 4, 2014 — Love Belfast (@love_belfast) November 4, 2014 John Bell: Belfast street to be named after physicist – a street next to Belfast Metropolitan College in the Titanic Quarter will be named Bell’s Theorem Way or Bell’s Theorem Crescent.
<urn:uuid:b3c054a0-637b-4529-b6d1-cea2c6643738>
CC-MAIN-2023-06
https://www.irishphilosophy.com/2014/11/04/john-stewart-bell/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764494974.98/warc/CC-MAIN-20230127065356-20230127095356-00298.warc.gz
en
0.952862
1,360
3.703125
4
Humanity is in a back-and-forth relationship with nature. First, we thought we were at the center of everything, with the Sun and the entire cosmos rotating around our little planet. We eventually realized that wasn’t true. Over the centuries, we’ve found that though Earth and life might be rare, our Sun is pretty normal, our Solar System is relatively non-descript, and even our galaxy is one of the billions of spiral galaxies, a type that makes up 60% of the galaxies in the Universe. But the Illustris TNG simulation shows that the Milky Way is special. The large scale structure of the universe is dominated by vast empty regions known as cosmic voids. These voids appear as holes hundreds of millions of light years across in the distribution of galaxies. However, new research shows that many of them may surprisingly still be filled with dark matter. In 1916, Einstein finished his Theory of General Relativity, which describes how gravitational forces alter the curvature of spacetime. Among other things, this theory predicted that the Universe is expanding, which was confirmed by the observations of Edwin Hubble in 1929. Since then, astronomers have looked farther into space (and hence, back in time) to measure how fast the Universe is expanding – aka. the Hubble Constant. These measurements have become increasingly accurate thanks to the discovery of the Cosmic Microwave Background (CMB) and observatories like the Hubble Space Telescope. Astronomers have traditionally done this in two ways: directly measuring it locally (using variable stars and supernovae) and indirectly based on redshift measurements of the CMB and cosmological models. Unfortunately, these two methods have produced different values over the past decade. As a result, astronomers have been looking for a possible solution to this problem, known as the “Hubble Tension.” According to a new paper by a team of astrophysicists, the existence of “Early Dark Energy” may be the solution cosmologists have been looking for. Constraints are critical in any scientific enterprise. If a hypothesis predicts that there should be an observable phenomenon, and there isn’t any trace of it, that’s a pretty clear indication that the hypothesis is wrong. And even false hypotheses still move science forward. So it is with astronomy and, in particular, explorations of the early universe. A paper authored by researchers at Cambridge and colleagues now puts a particularly useful constraint on the development of early galaxies, which has been a hot topic in astronomy as of late. For the first time, scientists have created a quantum computing experiment for studying the dynamics of wormholes — that is, shortcuts through spacetime that could get around relativity’s cosmic speed limits. “We found a quantum system that exhibits key properties of a gravitational wormhole, yet is sufficiently small to implement on today’s quantum hardware,” Caltech physicist Maria Spiropulu said in a news release. Spiropulu, the Nature paper’s senior author, is the principal investigator for a federally funded research program known as Quantum Communication Channels for Fundamental Physics. Don’t pack your bags for Alpha Centauri just yet: This wormhole simulation is nothing more than a simulation, analogous to a computer-generated black hole or supernova. And physicists still don’t see any conditions under which a traversable wormhole could actually be created. Someone would have to create negative energy first. According to the Standard Model of Particle Physics, the Universe is governed by four fundamental forces: electromagnetism, the weak nuclear force, the strong nuclear force, and gravity. Whereas the first three are described by Quantum Mechanics, gravity is described by Einstein’s Theory of General Relativity. Surprisingly, gravity is the one that presents the biggest challenges to physicists. While the theory accurately describes how gravity works for planets, stars, galaxies, and clusters, it does not apply perfectly at all scales. While General Relativity has been validated repeatedly over the past century (starting with the Eddington Eclipse Experiment in 1919), gaps still appear when scientists try to apply it at the quantum scale and to the Universe as a whole. According to a new study led by Simon Fraser University, an international team of researchers tested General Relativity on the largest of scales and concluded that it might need a tweak or two. This method could help scientists to resolve some of the biggest mysteries facing astrophysicists and cosmologists today. Johns Hopkins University (JHU) continues to pad its space community résumé with their interactive map, “The map of the observable Universe”, that takes viewers on a 13.7-billion-year-old tour of the cosmos from the present to the moments after the Big Bang. While JHU is responsible for creating the site, additional contributions were made by NASA, the European Space Agency, the National Science Foundation, and the Sloan Foundation. Something huge lurks in the shadows of the Universe. Known as the Great Attractor, it is causing the Milky Way and all the surrounding galaxies to rush towards it. We would normally have a better understanding of this situation, except for the fact that the Great Attractor happens to lie in the direction behind the galactic bulge, which makes it difficult for us to observe. A team of astronomers have performed a new infrared survey of the region behind the bulge, and they have found yet another large galaxy cluster. Their work is helping to paint a more complete portrait of the environment of the Great Attractor. In 2011, the Nobel Prize in physics was awarded to Perlmutter, Schmidt, and Reiss for their discovery that the universe is not just expanding, it is accelerating. The work supported the idea of a universe filled with dark energy and dark matter, and it was based on observations of distant supernovae. Particularly, Type Ia supernovae, which have consistent light curves we can use as standard candles to measure cosmic distances. Now a new study of more than 1,500 supernovae confirms dark energy and dark matter, but also raises questions about our cosmological models.
<urn:uuid:b1eecf7c-143f-4117-8904-ac7a9a064af6>
CC-MAIN-2023-06
https://www.universetoday.com/category/cosmology/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499816.79/warc/CC-MAIN-20230130101912-20230130131912-00059.warc.gz
en
0.941684
1,262
3.671875
4
Foundation of green computing was laid as far back as 1992 with the launching of Energy Star program in the USA. The success of Energy Star motivated other countries to take up the subject for investigation and implementation. Any technology that aspires to be nature-friendly ought to be green. Recognition of this fact has led to development of green generators, green automobiles, green energy, green chemistry as well as green computing. Green computing is a leap forward for information technology (IT), and more specifically for information and communication technology (ICT). Green computing has emerged as the next wave of ICT. Motivation for the subject of green computing arose to protect environment against hazards generated at three different states of ICT, namely, information collection (by electronic devices), information processing (through algorithms and storage) and information transportation (through networking and communication). Carbon dioxide accounts for about eighty per cent of global warming. As a rule of thumb, if world-wide increasing application of ICT is assumed to contribute at least twenty per cent towards carbon dioxide, ICT becomes responsible for sixteen per cent of global warming. This is undoubtedly a cause of concern. As per one research-based estimate, fifty billion devices like computers, mobile phones, sensors, actuators and robots shall connect to the Internet by this year’s end, creating even more havoc. Of course different strategies would be needed to nudge ICT towards green computing, which is necessary to reduce the pollutants generated in collection, processing and transportation of the information. In today’s scenario, primary challenge in achieving green computing is to realise energy-efficient devices, energy-efficient processing and energy-efficient networking. Invariably, energy efficiency is required to address reduction in heat dissipation that is basically responsible for emission of carbon dioxide. In case of electrical, electronic or computer systems, wasteful heat is generated due to thermal vibration of particles in the components. Therefore any brute attempt towards green initiative should have direct or indirect motivation to reduce this thermal vibration. Reduced circuitry or a minimal system helps in reducing the number of vibrating particles. Minimal circuit designs, which lead to technologies of very large scale integration (VLSI) or ultra large scale integration (ULSI), are now well-established technical solutions. These solutions meet the objectives of realising low cost and smaller-size systems. It was never thought these would also indirectly provide a solution for reducing particles in vibration. In the process of minimisation, two more revolutionary technologies have emerged: molecular scale of electronics (MSE) and Quantum computing. It was the quest for an ever decreasing size, but more complex electronic components with high speed ability, which gave rise to MSE. The concept that molecules may be designed to operate as a self-contained device was put forwarded by Carter. He proposed some molecular analogous of conventional electronic switches, gates and connections. Accordingly, an idea of a molecular P-N junction emerged. MSE is a simple interpolation of IC scaling. Scaling is an attractive technology. Scaling of FET and MOS transistors is more rigorous and well defined than that of bipolar transistors. But there are problems in scaling of silicon technology. In scaling, while propagation delay should be minimum and packing density should be high, these should not be at the expense of the power dissipated. With these scaling rules in minds, scaling technology of silicon is reaching a limit. Dr Barker reported that, “change, spin, conformation, colour, reactivity and lock-and-key recognition are just a few examples of molecular properties, which might be useful for representing and transforming logical information. To be useful, molecular scale logic will have to function close to the information theoretical limit of one bit on one carrier. Experimental practicalities suggest that it will be too easy to construct regular molecular arrays, preferably by chemical and physical self-organisation. This suggests that the natural logic architectures should be cellular automata: regular arrays of locally connected finite state machines where the state of each molecule might be represented by colour or by conformation. Schemes such as spectral hole burning already exist for storing and retrieving information in molecular arrays using light. The general problem of interfacing to a molecular system remains problematic. Molecular structures may be the first to take practical advantages of novel logic concepts such as emergent computation and ‘floating architecture’ in which computation is viewed as a self-organising process in a fluid-like medium.” Change is the only thing that is permanent in universe. In technology scenario, changes become inevitable means of evolution and revolution. In tune, a new generation of IT known as Quantum Computing (QC) has come up. Mechanical computing, electronic computing, quantum computing, DNA computing, cloud computing, chemical computing and bio computing are a few generation-wise migrations of information technology (IT). In conventional computers we work with now, computing and processing of data is based on transistors’ on and off states as binary representation of ‘1’ or ‘0,’ or vice versa. In quantum computers, the basic principle is to use quantum properties to represent data. Here, computation and processing of data is made with quantum mechanical phenomena such as superposition, parallelism and entanglement. Therefore, whereas in conventional computers data is represented by binary ‘bits,’ in quantum computers representation is done with ‘qubits’ (quantum bits). Qubits are typically subatomic particles such as electrons and photons. Generating, processing and managing qubits is an engineering challenge. Superiority of quantum computing over classical computing is multi-fold. First, whereas in classical computing logical bits are represented by on and off states of transistors, in quantum computing qubits are harnessed by properties of subatomic particles. Size of quantum computers will thus be much smaller than that of present-day computers. Both MSE and QC are thus found to be indirect solutions for green computing. At current state of technology march, Green ICT may be better looked at as a challenge to realise eco-friendly and environmentally-responsible solutions in order to reduce just not heat dissipation but also to maximise energy efficiency, recyclability and bio-degradability. Fact is, fast-growing production of electrical, electronic and computing equipment has resulted in enormous increase of e-waste, and especially carbon dioxide, which is responsible for creating havoc in environment and for increasing pollution. As per a report published by International Telecommunication Union (ITU), e-waste has increased rapidly and reached a global high. Increasing trend of e-waste all over the world is shown in Fig. 1. Many studies have established that computers and IT industries dissipate more energy than others. Impact of ICT industries on emission of carbon-dioxide is immense. As shown in Fig. 2, India is currently the third largest producer of carbon dioxide. Urgent solutions required at the levels of hardware design management include minimal configuration, adoptive configuration, consolidation by virtualisation, algorithmic efficiency, optimal resource utilisation, optimal data centres, optimal link utilisation, limiting power by reducing cable length, minimising protocol overhead, protocol for compressed header, green networking, management of e-waste, air management and cooling management, among others. For ICT scientists and engineers, the challenge will be to design technology and algorithms to minimise particle vibration, travel path and heat loss due to input-output mismatch. Design, operational and transmission related thermal loss are core issues of ICT. This makes production of Green ICT a great challenge, although, as parts of its implementation, energy- smart devices, sleep-mode devices, cluster computing, cloud computing, etc are already in place. Foundation of green ICT was laid as far back as 1992 with the launching of Energy Star program in the USA. The success of Energy Star motivated other countries to take up the subject for investigation and implementation. Leading countries working on green ICT now include Japan, Australia, Canada and The European Union. Formalisation of green ICT is in fact due to standards proposed by IEEE who has formalised Green Ethernet and 802.3az-enabled devices for green ICT. Green ICT is a clean-environment-based technology. However, fruitful realisation of green ICT is equally dependent upon awareness in society. Society needs to practice common ethics of ‘don’t keep computer on, when not needed,’ ‘don’t use Internet as a free tool, but as a valuable tool of necessity only,’ ‘don’t unnecessarily replace devices after devices just because you can afford to’ and so on. Without societal responsibility, technology alone cannot ensure achieving the objectives of green ICT. Prof. Chandan Tilak Bhunia, PhD in computer engineering from Jadavpur University, is fellow of Computer Society of India, Institution of Electronics & Telecommunication Engineers, and Institution of Engineers (India) Abhinandan Bhunia did B S in computer engineering from Drexel University, USA and MBA from University of Washington, USA
<urn:uuid:37e78868-9ed3-4e0f-bcd0-b96ff17b61c0>
CC-MAIN-2023-06
https://www.electronicsforu.com/technology-trends/tech-focus/green-computing-importance
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499831.97/warc/CC-MAIN-20230130232547-20230131022547-00541.warc.gz
en
0.927703
1,868
3.625
4
“Study by the University of Bonn could pave the way to new types of highly sensitive sensors Researchers at the University of Bonn have created a gas of light particles that can be extremely compressed. Their results confirm the predictions of central theories of quantum physics. The findings could also point the way to new types of sensors that can measure minute forces. The study is published in the journal Science. If you plug the outlet of an air pump with your finger, you can still push its piston down. The reason: Gases are fairly easy to compress - unlike liquids, for example. If the pump contained water instead of air, it would be essentially impossible to move the piston, even with the greatest effort. Gases usually consist of atoms or molecules that swirl more or less quickly through space. It is quite similar with light: Its smallest building blocks are photons, which in some respect behave like particles. And these photons can also be treated as a gas, however, one that behaves somewhat unusually: You can compress it under certain conditions with almost no effort. At least that is what theory predicts. Photons in the mirror box Researchers from the Institute of Applied Physics (IAP) at the University of Bonn have now demonstrated this very effect in experiments for the first time. “To do this, we stored light particles in a tiny box made of mirrors,” explains Dr. Julian Schmitt of the IAP, who is a principal investigator in the group of Prof. Dr. Martin Weitz. “The more photons we put in there, the denser the photon gas became.” The rule is usually: The denser a gas, the harder it is to compress. This is also the case with the plugged air pump - at first the piston can be pushed down very easily, but at some point it can hardly be moved any further, even when applying a lot of force. The Bonn experiments were initially similar: The more photons they put into the mirror box, the more difficult it became to compress the gas. However, the behavior changed abruptly at a certain point: As soon as the photon gas exceeded a specific density, it could suddenly be compressed with almost no resistance. “This effect results from the rules of quantum mechanics,” explains Schmitt, who is also an associate member of the Cluster of Excellence “Matter and Light for Quantum Computing” and project leader in the Transregio Collaborative Research Center 185. The reason: The light particles exhibit a “fuzziness” - in simple terms, their location is somewhat blurred. As they come very close to each other at high densities, the photons begin to overlap. Physicists then also speak of a “quantum degeneracy” of the gas. And it becomes much easier to compress such a quantum degenerate gas. If the overlap is strong enough, the light particles fuse to form a kind of super-photon, a Bose-Einstein condensate. In very simplified terms, this process can be compared to the freezing of water: In a liquid state, the water molecules are disordered; then, at the freezing point, the first ice crystals form, which eventually merge into an extended, highly ordered ice layer. “Islands of order” are also formed just before the formation of the Bose-Einstein condensate, and they become larger and larger with the further addition of photons. The condensate is formed only when these islands have grown so much that the order extends over the entire mirror box containing the photons. This can be compared to a lake on which independent ice floes have finally joined together to form a uniform surface. Naturally, this requires a much larger number of light particles in an extended box as compared to a small one. “We were able to demonstrate this relation in our experiments,” Schmitt points out. To create a gas with variable particle number and well-defined temperature, the researchers use a “heat bath”: “We insert molecules into the mirror box that can absorb the photons,” Schmitt explains. “Subsequently, they emit new photons that on average possess the temperature of the molecules - in our case, just under 300 Kelvin, which is about room temperature.” The researchers also had to overcome another obstacle: Photon gases are usually not uniformly dense - there are far more particles in some places than in others. This is due to the shape of the trap which they are usually contained in. “We took a different approach in our experiments,” says Erik Busley, first author of the publication. “We capture the photons in a flat-bottom mirror box that we created using a microstructuring method. This enabled us to create a homogeneous quantum gas of photons for the first time.” In the future, the quantum-enhanced compressibility of the gas will enable research into novel sensors that could measure tiny forces. Besides technological prospects, the results are also of great interest for fundamental research. The study was supported by the German Research Foundation (DFG) within the collaborative research center TRR 185 “OSCAR – Open System Control of Atomic and Photonic Matter” and the cluster of excellence “Matter and Light for Quantum Computing (ML4Q)“, and by the European Union within the framework of the quantum flagship project “PhoQuS – Photons for Quantum Simulation”. Publication: Erik Busley, Leon Espert Miranda, Andreas Redmann, Christian Kurtscheid, Kirankumar Karkihalli Umesh, Frank Vewinger, Martin Weitz and Julian Schmitt: Compressibility and the Equation of State of an Optical Quantum Gas in a Box; Science; DOI: https://doi.org/10.1126/science.abm2543”
<urn:uuid:43ada542-bf58-48ef-b8e2-918484eec54b>
CC-MAIN-2023-06
https://jpralves.net/post/2022/03/31/physicists-create-extremely-compressible-gas-of-light.html
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499646.23/warc/CC-MAIN-20230128153513-20230128183513-00499.warc.gz
en
0.946013
1,223
3.515625
4
The race to make good on quantum computing is well underway. Millions of dollars have been allocated to developing machines that could cause current computers to become obsolete. But, what is the difference between quantum and classical computing? This is a puzzle that is beginning to be unraveled. A few months ago, IBM unveiled the first quantum computer, the Q System. For newcomers to this computing paradigm, IBM explained that the quantum computer could solve (much more quickly than traditional computers) a set of much more complex calculations. “Qubits” were discussed as units of value, outpacing the traditional bits of classical computing. To understand how a quantum computer works, and the quantum mechanics on which it is based, we should look back to the beginning of the 20th century, when this physical theory was first raised. Among other subjects of study, quantum physics began with the study of an atom's particles and its electrons at a microscopic scale, something that had never been done before. Arnau Riera — doctor in theoretical physics; high school teacher; and advisor to Quantum, an exhibition hosted at the Center of Contemporary Culture of Barcelona (CCCB) — defines it as a conceptual change. "In the classical world, the properties of the systems that we study are well defined. In the quantum world, this isn’t the case: particles can have different values, they are not isolated objects, their states are diluted," he explains. Quantum physics is so complex that even Richard Feyman, 1965 Nobel Laureate in Physics and one of the fathers of quantum computing in the 1980s famously said, "I think I can safely say that nobody understands quantum mechanics”. As the reality of a quantum computer comes closer, it is useful for us to understand both how one functions and how it’s different from a traditional computer. The first thing to bear in mind is that they use different basic units of data: 'bits' and 'qubits'. Every element of a classical computer is written in binary code (1s and 0s) and is translated into electricity: high voltage is represented by 1, and low voltage by 0. In quantum computing, qubits are the basic unit and their value can be 1, 0, or 1 and 0 simultaneously, overlapping (superposition) and intertwining (entanglement) according to the laws of physics. This means that qubits, as opposed to bits, can take on various values at one time and can perform calculations that a conventional computer cannot. Juan José García Ripoll, researcher at the Institute of Fundamental Physics within the Spanish National Research Council, provides more clues. "In classical computing we know how to solve problems thanks to computer language (AND, OR NOT) used when programming. Operations that are not feasible in bit computing can be performed with a quantum computer. In a quantum computer all the numbers and possibilities that can be created with N qubits are superimposed (if there are 3 qubits, there will be 8 simultaneous possible permutations.) With 1,000 qubits the exponential possibilities far exceed those that we have in classical computing”. Currently, in contrast to classical computing, there are no quantum computing languages per se. Researchers work on developing algorithms (mathematical models that classical computers also work with) that can provide concrete solutions to the problems that are presented. "They work differently. A quantum computer isn't suitable for performing day-to-day tasks", Garcia Ripoll explains. "They don't have memory or a processor. We only have a group of qubits that we use to write information, and we work with those. There isn't an architecture as complicated as the architecture for a conventional computer. Today, quantum machines are primitive systems akin to a calculator at the turn of the last century, but their computing power for very specific problems is much greater than a traditional computer's. There is a dichotomy between what appears very simple and what it does, which is very powerful,” García Ripoll points out. What is a quantum computer like and under what conditions does it work? When IBM unveiled its quantum computer, many people were surprised by what it looked like. There were no screens, keyboards, or processors — computers elements we expect. In the photos there appears a bell-shaped machine covered in copper wires, then by a protective glass case. Only people who work with quantum computers and researchers can get close to this equipment. Researchers at CSIC use traditional computers and the cloud to interact with the quantum computers used for their research. “What we have available are prototypes that are very sensitive; they experience errors. They are very complex technically because as soon as an external agent influences or interacts with a quantum system, the qubits register it and fall out of superposition”, explains Riera. Whereas, with a classical computer, if there is interference with the system, the system can correct itself and continue running. For the time being, this is not the case with quantum computers. "External disturbances force the system to define itself as 1 or 0, causing it to lose its quantum coherence. To avoid this kind of external ‘noise,’ the system has to be completely isolated: the atoms have to be very quiet, ensuring nothing makes them collide or interact with the surroundings. This kind of ‘still state’ requires exact temperatures and processes,” the doctor in theoretical physics explains. Quantum computers have to be at a temperature of -273 °C (-459 °F) with hardly any atmospheric pressure and isolated from Earth's magnetic field. At the same time, information cannot be stored in a quantum computer because its operational window is very short. "It's computing time is finite: at some point the quantum properties of the computer are destroyed. They run for very short periods of time. We have to think about how to make the most of those timeframes and extract data in a very exact manner,” García Ripoll explains. What can we do with a quantum computer? Areas where quantum computing can deliver new applications and developments range from the pharmaceutical industry and medicine research, the creation of new materials, and even what is being called “quantum finance” — an area in which BBVA has already taken an interest. In this sector, we can use classical computing and mathematical algorithms to make predictions about the future risk of a portfolio or we can study the stock market during a window of time. But quantum computing opens a completely new range of options to be explored. "A quantum computer can create superposition with multiple probabilities that we cannot achieve today, let alone examine the features of those probabilities. With this type of application, the quantum computer will be much more efficient than a classical computer,” asserts García Ripoll. Despite all the possibilities promised by quantum computing, we mustn't get ahead of ourselves, particularly in everyday life. We won't see massive improvements in speed when downloading videos; nor will video game players benefit from even better graphics cards. Researchers are working on algorithms and mathematical models so that in a near future tasks that take a long time today can be executed more efficiently. "Quantum computing is just getting started, we are very much in the early days," concludes Garcáa Ripoll.
<urn:uuid:c6cc4f05-728d-415a-97d5-a4b020ce0a76>
CC-MAIN-2023-06
https://www.bbva.com/en/quantum-computing-how-it-differs-from-classical-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764495001.99/warc/CC-MAIN-20230127164242-20230127194242-00660.warc.gz
en
0.956636
1,484
3.5
4
Ultra-cold temperature physics opens way to understanding and applications Researchers doing ultra-cold temperature physics may not have to wear parkas, but they are producing the coldest temperatures ever and exploring model quantum systems that might lead to more accurate clocks and gyroscopes, quantum computers and communications as well as a better understanding of quantum physics phenomena. Nearly 80 years ago, Albert Einstein and Satyendra Nath Bose predicted that gases of atoms cooled down very close to absolute zero would behave in unison. In 1995, three laboratories produced such Bose-Einstein condensates and opened the door for investigation of physical properties of atoms on a very cold scale. David S. Weiss, associate professor of physics, Penn State, described recent research in one-dimensional quantum systems at the annual meeting of the American Association for the Advancement of Science today (Feb. 20) in Washington, D.C. “These ultra-cold atoms can act as model systems to help us understand other quantum systems,” says Weiss. “Their interactions can be calculated and controlled very accurately.” In a Bose-Einstein condensate, alkali metal atoms are cooled using lasers and a form of evaporation until they are a hair above absolute zero. Bosons, a class of particles that prefer to share the same energy state, when cooled this cold, begin to act in unison. The atoms wave functions — the description of each atoms position and momentum – all become identical. Initially, Bose-Einstein condensates were confined in featureless magnetic traps, but researchers have taken the experiments further. “By putting Bose-condensed atoms into versatile light traps, we can make atomic wave functions exhibit remarkable behavior,” says Weiss. “Most known quantum phenomena can be studied clearly with ultra-cold atoms, and as yet unknown phenomena can be conceived and observed.” The traps Weiss refers to are light traps created by lasers. By reflecting laser light back on itself, researchers create unmoving standing waves that, if created in a three-dimensional grid, can trap atoms. When this type of grid is superimposed over a Bose-Einstein condensate, the atoms segregate into individual traps, creating a matrix of tiny cells with ultra-cold atoms inside. Turning the lattice on and off can switch the system from a superfluid to something called a Mott insulator and back to a superfluid. Superfluids and Mott insulators have different quantum characteristics. Weiss, who is using rubidium 87, takes the grid one step further and creates a one-dimensional Tonks-Girardeau gas. By constraining the grid in two directions so that movement is only possible in one dimension, as if the atom were on a wire, Weiss creates a system where the bosons – rubidium 87 atoms – act like fermions. Fermions, unlike bosons, do not like to share energy states. Even near zero temperature, they avoid each other. In superconductivity, fermions act like bosons. In a Tonks-Girardeau gas, strongly interacting bosons act as non-interacting fermions. “A one-dimensional Tonks-Girardeau gas is one of very few many-particle systems that can be exactly solved mathematically,” says Weiss. “This was done in the 60s, but there had been no experimental system.” Now, Weiss can experimentally verify the mathematical calculations. Using these techniques, researchers may be able to understand superconductivity better, form quantum molecules and perhaps eventually create quantum computers. Along with rubidium, some other potential elements for Bose-Einstein condensates and ultra-cold quantum physics are sodium, cesium, lithium and ytterbium. Weiss considers quantum computing a promising way to use ultra cold atoms. The atoms can act as quantum bits, or qubits, with internal sub-states functioning as the ubiquitous 0 and 1s of computing. “However, quantum computers can only do a certain class of calculations, factoring large numbers for example,” says Weiss. “They might also be used to simulate other quantum mechanical systems, answering questions that are simply not answerable with any conceivable classic computer.” Superfluid clouds of atoms and grid-constrained super cold atoms are not the only possibilities researchers are exploring in ultra cold quantum physics. Other related areas of research include lattices of atomic vortices, coherent quantum chemistry and atomic interferometry. All latest news from the category: Physics and Astronomy This area deals with the fundamental laws and building blocks of nature and how they interact, the properties and the behavior of matter, and research into space and time and their structures. innovations-report provides in-depth reports and articles on subjects such as astrophysics, laser technologies, nuclear, quantum, particle and solid-state physics, nanotechnologies, planetary research and findings (Mars, Venus) and developments related to the Hubble Telescope.
<urn:uuid:865b46d7-7784-429c-a819-2930ad4bcd9f>
CC-MAIN-2023-06
https://www.innovations-report.com/physics-and-astronomy/report-40700/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764495012.84/warc/CC-MAIN-20230127195946-20230127225946-00780.warc.gz
en
0.915876
1,049
3.734375
4
Unraveling forces on cells with artificial cilia Jaap den Toonder, Professor of Microsystems, is developing a completely new system to better understand the effect of forces and flows on cells and tissues. He is using a special laser to create artificial cilia, inspired by vibrating hairs that occur in nature. Den Toonder is receiving an ERC Advanced Grant of 3 million euros to carry out this research. Almost every process in biology, from embryonic development to organ function and the incidence of disease, is based on biomechanical interactions between cells and their environment. If we understand these interactions, we can also better understand, for example, the spread of tumor cells or the brittleness of bones. The forces and flows between cells are often investigated by imitating fluid flows with valves and pumps, but this does not allow you to achieve the precision and control needed to make further steps in this research. Vibrating hairs inspired Den Toonder to build a new system, with which you can precisely control and study these forces and flows in a laboratory environment. Vibrating hairs, or cilia, are ultra-thin microscopic hairs, which move tightly packed together like a crowd doing the 'wave' in a stadium. Cilia are found everywhere in nature, also in our human body where they play an important role. Their synchronized movement, for example, helps to remove mucus from the lungs and transport eggs from the ovaries to the uterus. By regulating how the fluid flows around an embryo, vibrating hairs even ensure that organs such as the heart develop on the correct side of the body. Just like real cilia, the artificial hairs must, after an environmental signal, be able to initiate a flow in a fluid or exert mechanical forces on their environment. Then they must be able to detect the forces of reaction from the environment. And all in the same hair. Den Toonder: "The cilia we want to build consist of flexible polymers with magnetic nanoparticles. By activating them with an electromagnet, we can make the hairs move locally exactly as we want them to. This enables us to generate a flow in the surrounding fluid or forces on cells that we grow in the vicinity of the vibrating hairs. We then want to measure the biomechanical response of the cells very accurately." The cilia that Den Toonder wants to build are only ten micrometers long and no thicker than one micrometer. He also wants to place the hairs very close to each other and give them just the right flexibility to be easily moved by the magnetic fields. To build them, Den Toonder needs a brand new laser with a small focal point and ultra-short pulses. The laser inscribes very precise structures on a micro scale in a glass plate, which then serves as a mold with which the cilia are formed by means of a casting process. Den Toonder then places these hairs in a so-called microfluidic chip, a piece of plastic with small fluid channels, in which cells and tissues can also be grown. "We can vary the pattern in which we apply the hairs in the chip. For each biomechanical process we want to study, we make a specific chip. For example, compare it to a CD and CD player. The CD is the chip, it is replaceable. The CD player is our entire system of electromagnet, control and measuring equipment", says Den Toonder. Besides Den Toonder, Erik Bakkers is also a recipient of an ERC Advanced Grant. Read more about his research below. Demonstrate teleportation of Majorana particles with new nanomaterial Erik Bakkers, Professor of Advanced Nanomaterials & Devices, focuses his research on a new nanomaterial and thereby hopes to conclusively demonstrate the teleportation of Majorana particles. This is an essential step in the construction of the Majorana quantum computer. Bakkers will receive anAdvanced Grant of 2.5 million euros. The award to Erik Bakkers builds on his highly successful ERC Consolidator Grant of 2013 that helped fund his presentation in 2017 of an advanced quantum chip with nano-hashtags and consequently in 2018 the long expected zero-bias peak, which has exactly the same peak as was predicted by the Majorana theory. Bakkers: "These results are extremely important, but also showed us that the current combination of semiconductor (indium antimonide) and superconductor (aluminum) is not ideal for the next step in Majorana research. "The transition between these two materials is not very sharp, because the aluminum reacts chemically with the indium antimonide. In addition, high magnetic fields are required to reach the required topological state, which is very difficult. The topology is intended to protect the Majorana particle so that it is much more stable than other quantum states. Robust crystal lattice Bakkers therefore wants to use the Advanced Grant to develop a new material combination: topological crystalline insulator nanowires of tin telluride coupled to the superconductor lead. This material occurs naturally in a topological state, which is formed by the symmetry of the crystal lattice. Because the crystal lattice of this material is very simple, the same as that of kitchen salt, everything is much more robust. Lead is also a stronger superconductor than aluminum and this combination should make it easier to find and manipulate Majorana conditions. Bakkers begins the research by growing high-quality tinelluride nanowires. For this growth process he also wants to use a growth strategy that has never been used for these materials before, namely a high-vacuum technique (Molecular Beam Epitaxy) to produce extremely pure material. "The results from the earlier ERC study already gave strong indications of the presence of Majorana particles. But in order to really demonstrate their presence, two things have to be proven: teleportation and interdependence. Using this Advanced Grant I want to prove teleportation", says Bakkers. This requires an entangled pair of particles to appear on both sides of the nanowire and these states must be linked. Bakkers: "For example, if I change the electric field on one side, the particle on the other side must simultaneously show the same change." Quantum teleportation forms the basis of the qubit, the building block of the Majorana quantum computer. "That application is on the distant horizon", says Bakkers.
<urn:uuid:e4d127f6-c554-427b-8087-5722bbd46f71>
CC-MAIN-2023-06
https://www.cursor.tue.nl/en/news/2019/maart/week-4/erc-advanced-grants-for-tue-professors-bakkers-and-den-toonder/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500904.44/warc/CC-MAIN-20230208191211-20230208221211-00343.warc.gz
en
0.942458
1,335
3.765625
4
Understanding the full extent of climate change and its effects can be complicated. But the issue causing climate change can be understood relatively simply: there is too much carbon dioxide in our atmosphere. To deal with that issue, we focus largely on how we can reduce the amount of carbon that we are emitting, but there’s another approach as well: removing the carbon that is already in the atmosphere. Accomplishing that task is currently costly and more theoretical than it is practical, but breakthroughs in quantum computing may hold the key to quickly, efficiently and effectively sucking pollution right out of the sky. The concept of quantum computing provides promise for a lot of industries, from medical research to weather modeling. But one of the areas it could provide the most positive impact is in addressing climate change – particularly when it comes to capturing carbon and cutting down on the amount of energy that we are using. The new model of computation would unlock a level of computing power that is currently unachievable by conventional computers, which opens up the possibility of new models and simulations – new insights into the world that we can’t currently see, including new methods to contain and eliminate the emissions that we have spent more than a century sending up into the atmosphere. What is quantum computing? To understand the potential of quantum computing, it’s important to understand what exactly quantum computing is – which is no small task, seeing as it relies on theoretical physics. Traditional computers rely on bits to store data, which you have likely seen represented with 0s and 1s. Quantum computing uses a mechanical phenomenon known as quantum bits or qubits to store information, and qubits do not have the same binary restrictions as their traditional counterparts. Rather than being a 0 or a 1, a qubit can be a 0, a 1 or both simultaneously. This technical achievement is enabled by microscopic particles like electrons and photons that can occupy different states at the same time, as long as they are not observed. If you’re familiar with the Schrödinger’s cat thought experiment, then you’ll at least have an idea of how this works: essentially, until you look at something, you never truly know what state it is in so it can be in multiple states at once. In recent years, quantum computing has broken from the realm of the theoretical and become more of a reality. Earlier this year, Google claimed “quantum supremacy” – an achievement accomplished by building a quantum processor capable of completing computations that would essentially be impossible for any traditional computer to process. The company claimed its processor completed in just 200 seconds a task that would have taken a traditional computer about 10,000 years to do. Google’s claims were called into question by competitors, but regardless if Google achieved true quantum supremacy, it did prove the viability of quantum computing and opened the door to new developments and breakthroughs in computing power. The uncertainty of climate change Enter climate change. We know that the planet is warming. According to the National Oceanic and Atmospheric Administration, the Earth’s average surface temperature has risen about 1.62 degrees Fahrenheit (about 0.9 degrees Celsius) since the late 19th century. We also know that during that time frame, humans have pumped more carbon dioxide and other emissions into the atmosphere than at any other time in human history. We have a wealth of data documenting these changes – enough so that there is a scientific consensus that human-caused climate change is real. What we don’t have at this point is a reliable way to understand the effects of these changes or predict future outcomes. Scientists do have tools that they use to project the potential changes that the planet might experience because of climate change, but those models are largely limited by traditional computing power. If you’ve ever opened your weather app and found that the forecast was entirely wrong, you’ve experienced the shortcomings of current modeling systems. Meteorologists and scientists do the best they can with the tools they have, but there really is no surefire way to project how our emissions are affecting the atmosphere and what sort of long-term outcomes we might experience because of it. How quantum computing can help address climate change Quantum computing can close that gap – and more than that, it might contain a key to solving our emissions problem. Because the computational power of quantum processors are multitudes more powerful than traditional alternatives, computer models can become much more accurate. By feeding larger datasets into the machine and having that information processed quicker and more efficiently than ever before, we can get a clearer view of what exactly climate change is doing to the planet and what might be on the horizon for us. These models can also extend to understanding large complex molecules – something that traditional computers are effectively unable to accomplish. A report from the World Economic Forum explains this is because simulating a complex molecule requires exponentially more computer power with every atom added, and by the time you attempt to render a molecule with 70 atoms, it would take a traditional computer about 13 billion years to accomplish that. Quantum computing could allow us to finally accurately simulate complex molecules, which would open up the possibility of understanding exactly how carbon dioxide would react to different methods for capturing and processing it. This would allow scientists to determine the best ways to literally suck carbon out of the atmosphere, as well as discover new methods to recycle and reuse existing carbon rather than pumping out more emissions. If we know, with reasonable accuracy, how carbon reacts to different ways of interacting with it through simulations, we can finally take action to remove the harmful gas from our atmosphere. Carbon capture is something that has been on the minds of scientists for decades now, with new tools on the horizon that can help to suck emissions out of the sky and put them to use again. Recent breakthroughs suggest that it is possible to turn greenhouse gas emissions into a fuel source that can be reused. These types of developments are not a suitable replacement for lowering our levels of emissions and ending the practice of indiscriminately pumping carbon dioxide and other harmful greenhouse gases into the atmosphere, but it provides a serviceable middle ground between continuing down our current path and finally embracing the reality of climate change and taking the drastic action needed to prevent the most devastating effects that are looming in the future. Quantum computing may finally unlock the technology that we need to remove as much carbon as possible from the atmosphere and put it to good use. Until we finally achieve net-zero carbon emissions – something that the United Nations’ Intergovernmental Panel on Climate Change believes we need to accomplish by 2050 if we have any hope of limiting the impact of climate change – finding a worthwhile way to suck up excess carbon and put it to work would come as a marked improvement. If we have to have excess carbon in the atmosphere, we might as well make good use of it. Article by channel: Everything you need to know about Digital Transformation The best articles, news and events direct to your inbox Read more articles tagged: Climate Change
<urn:uuid:7dc49f3e-e64c-4328-9520-83c3f9b12ee4>
CC-MAIN-2023-06
https://www.thedigitaltransformationpeople.com/channels/sustainability/how-quantum-computing-could-help-solve-climate-change/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500983.76/warc/CC-MAIN-20230208222635-20230209012635-00464.warc.gz
en
0.956015
1,414
4.15625
4
Vuckovic’s Stanford team is developing materials that can trap a single, isolated electron. Working with collaborators worldwide, they have recently tested three different approaches to the problem, one of which can operate at room temperature – a critical step if quantum computing is going to become a practical tool. In all three cases the group started with semiconductor crystals, material with a regular atomic lattice like the girders of a skyscraper. By slightly altering this lattice, they sought to create a structure in which the atomic forces exerted by the material could confine a spinning electron. “We are trying to develop the basic working unit of a quantum chip, the equivalent of the transistor on a silicon chip,” Vuckovic said. One way to create this laser-electron interaction chamber is through a structure known as a quantum dot. Physically, the quantum dot is a small amount of indium arsenide inside a crystal of gallium arsenide. The atomic properties of the two materials are known to trap a spinning electron. In a recent paper in Nature Physics, Kevin Fischer, a graduate student in the Vuckovic lab, describes how the laser-electron processes can be exploited within such a quantum dot to control the input and output of light. By sending more laser power to the quantum dot, the researchers could force it to emit exactly two photons rather than one. They say the quantum dot has practical advantages over other leading quantum computing platforms but still requires cryogenic cooling, so it may not be useful for general-purpose computing. However, it could have applications in creating tamper-proof communications networks. In two other papers Vuckovic took a different approach to electron capture, by modifying a single crystal to trap light in what is called a color center. In a recent paper published in NanoLetters, her team focused on color centers in diamond. In nature the crystalline lattice of a diamond consists of carbon atoms. Jingyuan Linda Zhang, a graduate student in Vuckovic’s lab, described how a 16-member research team replaced some of those carbon atoms with silicon atoms. This one alteration created color centers that effectively trapped spinning electrons in the diamond lattice. But like the quantum dot, most diamond color center experiments require cryogenic cooling. Though that is an improvement over other approaches that required even more elaborate cooling, Vuckovic wanted to do better. So she worked with another global team to experiment with a third material, silicon carbide. Commonly known as carborundum, silicon carbide is a hard, transparent crystal used to make clutch plates, brake pads and bulletproof vests. Prior research had shown that silicon carbide could be modified to create color centers at room temperature. But this potential had not yet been made efficient enough to yield a quantum chip. Silicon carbide is a promising platform for single photon sources, quantum bits (qubits), and nanoscale sensors based on individual color centers. Toward this goal, we develop a scalable array of nanopillars incorporating single silicon vacancy centers in 4H-SiC, readily available for efficient interfacing with free-space objective and lensed-fibers. A commercially obtained substrate is irradiated with 2 MeV electron beams to create vacancies. Subsequent lithographic process forms 800 nm tall nanopillars with 400–1400 nm diameters. We obtain high collection efficiency of up to 22 kcounts/s optical saturation rates from a single silicon vacancy center while preserving the single photon emission and the optically induced electron-spin polarization properties. Our study demonstrates silicon carbide as a readily available platform for scalable quantum photonics architecture relying on single photon sources and qubits. Vuckovic’s team knocked certain silicon atoms out of the silicon carbide lattice in a way that created highly efficient color centers. They also fabricated nanowire structures around the color centers to improve the extraction of photons. Radulaski was the first author on that experiment, which is described in another NanoLetters paper. She said the net results – an efficient color center, operating at room temperature, in a material familiar to industry – were huge pluses. “We think we’ve demonstrated a practical approach to making a quantum chip,” Radulaski said. But the field is still in its early days and electron tapping is no simple feat. Even the researchers aren’t sure which method or methods will win out. “We don’t know yet which approach is best, so we continue to experiment,” Vuckovic said. Brian Wang is a Futurist Thought Leader and a popular Science blogger with 1 million readers per month. His blog Nextbigfuture.com is ranked #1 Science News Blog. It covers many disruptive technology and trends including Space, Robotics, Artificial Intelligence, Medicine, Anti-aging Biotechnology, and Nanotechnology. Known for identifying cutting edge technologies, he is currently a Co-Founder of a startup and fundraiser for high potential early-stage companies. He is the Head of Research for Allocations for deep technology investments and an Angel Investor at Space Angels. A frequent speaker at corporations, he has been a TEDx speaker, a Singularity University speaker and guest at numerous interviews for radio and podcasts. He is open to public speaking and advising engagements.
<urn:uuid:f24c321a-d99e-415f-9068-d9b6a7028218>
CC-MAIN-2023-06
https://www.nextbigfuture.com/2017/05/quantum-computing-closer-to-reality-with-new-materials.html
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500365.52/warc/CC-MAIN-20230206212647-20230207002647-00664.warc.gz
en
0.922101
1,099
3.90625
4
We all mark days with clocks and calendars, but perhaps no timepiece is more immediate than a mirror. The changes we notice over the years vividly illustrate science's "arrow of time" — the likely progression from order to disorder. We cannot reverse this arrow any more than we can erase all our wrinkles or restore a shattered teacup to its original form. Or can we? An international team of scientists led by the U.S. Department of Energy's (DOE) Argonne National Laboratory explored this question in a first-of-its-kind experiment, managing to return a computer briefly to the past. The results, published March 13 in the journal Scientific Reports, suggest new paths for exploring the backward flow of time in quantum systems. They also open new possibilities for quantum computer program testing and error correction. A quantum computer able to effectively jump back and clean up errors as it works could operate far more efficiently. To achieve the time reversal, the research team developed an algorithm for IBM's public quantum computer that simulates the scattering of a particle. In classical physics, this might appear as a billiard ball struck by a cue, traveling in a line. But in the quantum world, one scattered particle takes on a fractured quality, spreading in multiple directions. To reverse its quantum evolution is like reversing the rings created when a stone is thrown into a pond. In nature, restoring this particle back to its original state — in essence, putting the broken teacup back together — is impossible. The main problem is that you would need a "supersystem," or external force, to manipulate the particle's quantum waves at every point. But, the researchers note, the timeline required for this supersystem to spontaneously appear and properly manipulate the quantum waves would extend longer than that of the universe itself. Undeterred, the team set out to determine how this complexity might be overcome, at least in principle. Their algorithm simulated an electron scattering by a two-level quantum system, “impersonated” by a quantum computer qubit — the basic unit of quantum information — and its related evolution in time. The electron goes from a localized, or "seen," state, to a scattered one. Then the algorithm throws the process in reverse, and the particle returns to its initial state — in other words, it moves back in time, if only by a tiny fraction of a second. Given that quantum mechanics is governed by probability rather than certainty, the odds for achieving this time-travel feat were pretty good: The algorithm delivered the same result 85 percent of the time in a two-qubit quantum computer. "We did what was considered impossible before," said Argonne senior scientist Valerii Vinokur, who led the research. The result deepens our understanding of how the second law of thermodynamics — that a system will always move from order to entropy and not the other way around — acts in the quantum world. The researchers demonstrated in previous work that, by teleportating information, a local violation of the second law was possible in a quantum system separated into remote parts that could balance each other out. “The results also give a nod to the idea that irreversibility results from measurement, highlighting the role that the concept of “measurement” plays in the very foundation of quantum physics,” said article coauthor Gordey Lesovik of the Moscow Institute of Physics and Technology. This is the same notion Austrian physicist Erwin Schrödinger captured with his famous thought experiment, in which a cat sealed in a box might remain both dead and alive until its status is monitored somehow. The researchers suspended their particle in this superposition, or form of quantum limbo, by limiting their measurements. "This was the essential part of our algorithm," Vinokur said. "We measured the state of the system in the very beginning and at the very end, but did not interfere in the middle." The finding may eventually enable better methods of error correction on quantum computers, where accumulated glitches generate heat and beget new ones. A quantum computer able to effectively jump back and clean up errors as it works could operate far more efficiently. "At this moment, it's very hard to imagine all the implications this can have," Vinokur said. "I am optimistic, and I believe that it will be many." The study also begs the question, can the researchers now figure out a way to make older folks young again? "Maybe," Vinokur jokes, "with the proper funding." The work was done by international team including researchers from the Moscow Institute of Physics and Technology (Gordey Lesovik, Andrey Lebedev, Mikhail Suslov), ETH Zurich (Andrey Lebedev) and Argonne National Laboratory, U.S. (Valerii Vinokur, Ivan Sadovskyy). Funding for this research was provided by the DOE Office of Science and Strategic Partnership Projects (Swiss National Foundation and the Foundation for the Advancement of Theoretical Physics “BASIS”). Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the Office of Science website.
<urn:uuid:17fa7d69-02bc-4509-97b0-d5786ab2998f>
CC-MAIN-2023-06
https://www.eurekalert.org/news-releases/893598
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499541.63/warc/CC-MAIN-20230128090359-20230128120359-00265.warc.gz
en
0.936637
1,227
3.625
4
In 1933, Einstein, together with two younger colleagues – Boris Podolsky (1896-1966) and Nathan Rosen (1909-1995) – published a thought experiment that proved to be a particularly serious attack on quantum physics. Their publication became known in history as the EPR paradox. According to Einstein, Podolsky and Rosen, there was a conflict between the following statements of physics and of quantum mechanics: - An experiment can produce two particles with identical or strongly related values for certain properties. These particles will keep these values when undisturbed, even after a long period of time. - The behavior of the particles is subject to the conservation laws of physics. - The result of a measurement on a particle has a statistical uncertainty that, in principle, cannot be predicted according to quantum physics. - The uncertainty relationship of Heisenberg says that the position and the momentum (mass times speed) of a particle can never both be measured at the same time with unlimited precision. The more accurately its position is determined, the less accurately its momentum will be determined, and vice versa. About momentum and the law of impulse conservation The momentum of a particle is the impact a particle has in a collision. This depends on its speed and its mass. Even though a fly and a bus have the same speed, the impact they can deliver differs considerably. To accurately define impact in physics, the mass of an object is multiplied by its speed, this is momentum. The law of conservation of momentum is a physical theorem that states that the momentum of a closed system never changes. Two – or more – billiard balls that collide with each other therefore posses together the same total momentum before and after the collision, only distributed differently. The symbol in physics for impulse is p. So p = m v Question: Is it possible for a grain of sand and a brick to posses the same amount of momentum? The uncertainty principle of Heisenberg in an equation: Δq stands for the uncertainty in position when measuring a particle, Δp stands for the uncertainty in momentum when measured. The product of these two can’t be less than Planck’s constant (h) divided by 4π. That is a very small number – 1.05 x 10-34 – but is nevertheless an absolute lower limit on the possible accuracy of a measurement. This becomes important when measuring very small particles, such as electrons. This limit is not the result of the limited precision of our measuring instruments, but it is a fundamental property of observable nature as examined by physics. The EPR thought experiment EPR stands for Einstein-Podolsky-Rosen. What they proposed is this: Two identical particles A and B are initially at rest. They fly apart at time I. We wait with measuring them until they have traveled very far from each other. At time II we measure the momentum pA of particle A. Heisenberg does not prohibit us from measuring pA as accurately as we want. The position qA of of A then becomes inversely proportionally uncertain according to Heisenberg. Through the law of conservation of momentum, we now also know the momentum pB of particle B. This is the opposite to pA with exactly the same magnitude. At the same time, we measure the position qB of particle B. We can do that also as accurately as we wish. That should also be possible, according to Einstein, when Heisenberg is correct, because it is no longer connected to particle A. The momentum of B then becomes correspondingly uncertain according to Heisenberg. That shouldn’t be a problem in itself, but now comes the surprise, says Einstein. Because of the symmetry, we know now the position qA of particle A as accurately as we wish. At the same time we know the momentum of particle A as accurately as we wish. In this way, according to Einstein, the Heisenberg uncertainty relation can be circumvented, unless the particles communicate in some way with each other. For instance, that particle A informs B that its momentum has been measured so that particle B has to keep its position uncertain in order to keep up with the uncertainty relation . This communication would have to be instantaneous because otherwise the conservation laws would be temporarily violated. If you measure both particles at the same time, the total result of their momentum and position must still satisfy the conservation laws. Einstein: ‘Es könne keine solche spukhafte Fernwirkung geben‘. So no spooky operation at a distance, please. Niels Bohr’s answer to the challenge Niels Bohr had been confronted with Einstein’s clever thought experiments before and each time he had been able to parry Einstein by pointing out errors in his reasoning. However, this time it was more difficult for Bohr. Bohr’s final answer was “entanglement“. Bohr pointed out that according to the Copenhagen interpretation the quantum wave that describes the behavior of the two particles before they are measured is not a material wave and that that wave is therefore not subjected to the laws of relativity. Relativity theory belongs fully to classical physics, therefore it only applies to matter. Only on measuring one of the particles does the collective quantum wave ‘collapse’ over its entirety. Bohr called this joint quantum state of two particles entanglement. This happens when objects, such as particles, have a shared history. Now think about the Big Bang. Is the universe one single entangled state wave? Some physicists do think so. Anyhow, also read my post ‘Schrödinger’s stopwatch‘. Entanglement plays an important role there. Quantum entanglement is a fully accepted phenomenon today and is used, among other things, to make measurements without directly measuring the measured particle itself. Numerous Bell experiments have confirmed that quantum entanglement exists and is indeed faster than light. That’s especially stunning for people who don’t want to let go of the idea of permanently existing matter, and there are quite a few. If you are still in doubt here, China is not. Chinese scientists take entanglement very seriously and are building a quantum radar system based on entangled radar photons. Revealing quantum experiments have been done with a special type of instrument — the Mach-Zehnder interferometer — which seem to show that quantum objects, such as photons, only exist when they are measured. In order to understand this result, it is necessary to study this type of interferometer in detail first.
<urn:uuid:a2ece729-09f6-4c4a-8371-24588de1ab4e>
CC-MAIN-2023-06
https://quantumphysics-consciousness.eu/index.php/en/quantum-entanglement/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499966.43/warc/CC-MAIN-20230209112510-20230209142510-00465.warc.gz
en
0.946996
1,367
4.3125
4
Superconducting materials are hailed as the “holy grail” of condensed matter physics since their applications are so extensive. From levitating trains and quantum computing to faster and more efficient classical electronics, superconductivity is heavily researched for the swathe of use cases that could transform by vanquishing electrical resistance and magnetic field. Superconductivity can cause magnetic materials to levitate due to effects on magnetic field lines. Image used courtesy of the University of Rochester Yet, conventional methods to obtain superconductivity are far from economical, requiring massive amounts of energy and cryogenic cooling. Hence, the next step to achieve affordable and useful superconductivity is to reach superconductivity at higher temperatures (any temperature above 90K (−183°C) in superconductors is considered “high”) with the eventual goal being room temperature. Some of the top electrical engineering research institutions have published new findings on this goal in the past few months, with achievements hailing from the University of Rochester, MIT, and Yale. The “World’s First Room-temperature Superconductor” Instead of achieving superconductivity by means of cooling, the researchers were able to achieve this temperature feat by applying extremely high pressures to a hydrogen-rich material that mimics the lightweight and strong-bond characteristics of pure hydrogen—a strong candidate for high-temperature superconductors. This material made of yttrium and hydrogen (“yttrium superhydride”), which can be metalized at significantly lower pressures, exhibited a pressure of 26 million pounds per square inch and a record high temperature of 12°F. The researchers used a diamond anvil cell to test superconducting materials. Image used courtesy of the University of Rochester According to their article in Nature, the team’s next step was to create a “covalent hydrogen-rich organic-derived material” called carbonaceous sulfur hydride. It was this material that then exhibited superconductivity at 58°F by applying 39 million PSI of pressure. For this achievement, lead researcher Ranga Dias was announced as a Time100 Next innovator this past week. MIT Devises a Three-Layer Graphene “Sandwich” While the University of Rochester’s findings are a significant step forward to reach superconductivity, the high pressures required still limit the feasibility of this technique in the real world. Earlier this month, MIT researchers published a paper that describes a method for obtaining superconductivity at high temperatures without requiring immense pressure. A 3 layer graphene “sandwich” has shown superconductive behavior at 3K. Image used courtesy of MIT In 2018, researchers were able to show that when two thin films of graphene are placed on top of one another at a specific angle, the structure actually becomes a superconductor. Since then, the search for more materials sharing this property has proven fruitless—until now. Now, the same MIT researchers have been able to observe superconductivity in a three-layer graphene “sandwich,” the middle layer of which is twisted at a new angle with respect to the outer layers. Compared to the original two-layer superconductive material, which has a critical temperature of 1K, the new three-layer material has shown a critical temperature of 3K. As for the exact reason, the scientists are still unsure. “For the moment we have a correlation, not a causation,” the researchers noted in a university press release. Reimaging Coulomb’s Law for High-temperature Superconductors More superconductor news emerged from Yale University this month, where researchers published a study that challenges fundamental understandings of electromagnetics in superconductors. Their study, which was focused on high-temperature superconductors, found that in this state the behavior of electrons does not follow Coulomb’s law. Normally, two electrons typically repel one another, working to move to the place of lowest energy between one another (which is theoretically infinity). Two equations associated with Coulomb’s law. Image used courtesy of the Physics Hypertextbook Surprisingly, the Yale researchers found that in high-temperature superconductors, electrons behave independently from other atomic particles, creating a ring-like structure with each other. This is fundamentally opposed to previous understandings of Coulomb’s law: instead of moving infinitely away from one another, the electrons move close together, forming a ring-like structure. The researchers theorize that this unprecedented effect may be caused by the “underlying functional form of the Coulomb interaction between valence electrons.” Warming Superconductors Takes Time While the realities of room-temperature superconductors (beyond a stringent lab setting) are far from a reality, the recent studies from these institutions indicate that researchers are on the right trail. “History has taught us that a quest like that can take time,” explains superconductor researcher Van der Molen, professor of condensed matter physics at Leiden University. “Kamerlingh Onnes discovered superconductivity in 1911, but it wasn’t until 1957 that a good explanatory theory was published. . . . It’s complicated, even for physicists.” Catch Up on More Superconductivity Research For lighting, electrical, signage, and technology solutions that allow you to do more call Sverige Energy today at +4(670) 4122522.
<urn:uuid:d7f31615-ab5a-42b0-b8be-5cd8c22fd57e>
CC-MAIN-2023-06
https://sverige.energy/%D1%81ircuits/the-race-toward-room-temperature-superconductors-heats-up
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764495012.84/warc/CC-MAIN-20230127195946-20230127225946-00786.warc.gz
en
0.91384
1,149
4.375
4
When the bizarre world of quantum physics — where a "cat" can be both alive and dead, and particles a galaxy apart are connected — is merged with computer technology, the result is unprecedented power to anyone who masters this technology first. There is an obvious dark side. Imagine a world where online bank accounts could be easily hacked into and robbed. But this power can also be turned to good, allowing new drugs to be designed with unprecedented speed to cure disease. To prepare for such a future, many countries are investing billions (opens in new tab) to unlock the potential of what is called quantum computing. With an eye toward the future, a group of researchers at Fermilab,a particle physics laboratory in Batavia, Ill., has worked with high-school teachers to develop a program to train their students in this emerging field. This program, called "Quantum Computing as a High School Module," was developed in collaboration with young students in mind. But it's also a perfect diversion for science enthusiasts of any age who suddenly have a lot of time on their hands. This online training course introduces students to quantum concepts, including superposition, qubits, encryption, and many others. These additional concepts include quantum measurement, entanglement and teleportation; students will also learn and how to use quantum computers to prevent hacking. The course is also appropriate for community college or undergraduate students in areas outside of physics, such as computer science, engineering or mathematics, as well as a science literate public. One of the course's teachers, Ranbel Sun wrote, "It was great to work with a couple of America's smartest researchers to make sure that the science was right. Combining their knowledge and our teaching experience, we have developed an understandable learning program which bridges the gap between popular media and college textbooks." Quantum computing uses the principles of quantum physics, which were developed in the early 1900s. Quantum physics describes the tiny realm of atoms, where the laws of nature seem to be very different from the world we can see. In this microcosm, electrons and particles of light called photons simultaneously act as both waves and particles — a seeming absurdity, but one that is well accepted among scientists. This non-intuitive quantum behavior has been exploited to develop powerful technologies, like the lasers and transistors that form the backbone of our technological society. Nobel Prize winning physicist Richard Feynman was the first to suggest that computers could be built to directly exploit the laws of quantum mechanics. If successful, these quantum computers could solve incredibly important and difficult problems that are too complex for even the most powerful modern supercomputers to solve. Last year, Google used a quantum computer called Sycamore to solve a problem thought to be virtually unsolvable by conventional computers; a calculation that would take the most powerful supercomputers 10,000 years to finish was solved in just 200 seconds by Sycamore. The familiar computer on your desk uses a vast array of objects called bits to operate. Bits are basically simple switches that can be either on or off, which is mathematically equivalent to ones and zeros. Quantum computers rely on qubits, which can simultaneously be both on and off at the same time. This peculiar feature is common in the quantum world and is called superposition: being in two states at once. Researcher Ciaran Hughes said, "The quantum world is very different from the familiar one, which leads to opportunities not available using classical computers." In 1994, Peter Shor invented an algorithm that revealed the power of quantum computing. His algorithm would allow quantum computers to factorize a number enormously faster than any classically known algorithm. Factorizing numbers is important because the encryption system used by computers to communicate securely relies on the mathematics of prime numbers. Prime numbers are numbers that are divisible only by one and themselves. In a standard encryption algorithm, two very large prime numbers are multiplied together, resulting in an even larger number. The key to breaking the security code is to take the large number and find the two prime numbers that were multiplied together to make it. Finding these prime numbers is extremely hard for ordinary computers and can take centuries to accomplish. However, using Shor's quantum algorithm, finding these prime factors is much easier. A working quantum computer would make our standard method of encryption no longer secure, resulting in the need for new encryption methods. Fermilab researcher Jessica Turner said, "Quantum computing is a very new way of thinking and will be revolutionary, but only if we can develop programmers with quantum intuition." Obviously, any nation state or individual who is able to crack encryption codes will have a huge information advantage. The competition to develop working quantum computers is the new space race. Quantum computing has the potential to overturn how computers securely communicate: from health care, to financial services and online security. Like it or not, the future is quantum computing. To fully reap the rewards of this quantum revolution requires a quantum fluent workforce. This new program is a very helpful step towards that goal. The researchers have made their training program freely available. - The world's most beautiful equations - The 9 most massive numbers in existence - The 18 biggest unsolved mysteries in physics Originally published on Live Science. For a limited time, you can take out a digital subscription to any of our best-selling science magazines for just $2.38 per month, or 45% off the standard price for the first three months.
<urn:uuid:ea65c017-7871-4cb2-b2d8-6bc1b3670864>
CC-MAIN-2023-06
https://www.livescience.com/quantum-computing-students-online-course.html
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500288.69/warc/CC-MAIN-20230205193202-20230205223202-00707.warc.gz
en
0.95861
1,107
3.921875
4
Quantum computing utilize the properties of quantum physics for storing data and for performing computations. It might be highly beneficial for some tasks where they might vastly outperform some of the best supercomputers. Classical computers, on the other hand, encode information in the form of bits that is in 0s and 1s. In a quantum computer, memory in the basic unit is a qubit or a quantum bit. Qubits are created from physical systems, like the spin of the electron or photon orientation. The systems might be in a lot of different settings all at one time; a property called the quantum superposition. The qubits might also be inextricably associated together through a phenomenon known as quantum entanglement. The outcome is a series of qubits that represent different data at the same time. For example, eight bits is sufficient for one classical computer for representing any number that ranges between 0 and 255. However, eight qubits are sufficient for one quantum computer to show each number lying between 0 and 255. Some entangled qubits might be sufficient to show more numbers than there is a number of atoms present in the universe. It is where quantum computing gain an advantage over classical computers. In situations in which there are many possible combinations, quantum computers might consider them at the same time. Some of the examples involve trying to find the total prime factors of a large digit or to find the shortest route between two places. How Do Quantum Computing Work? Instead of the conventional bits, quantum computers utilize qubits. Rather than just being on or off, the qubits might also be in the position called “superposition.” This is the position where they are both on and off simultaneously or somewhere lying in the spectrum. Bits are utilized for representing the information in the regular computers. The quantum computers are dependent on quantum bits, which are qubits that are also able to be made from one electron. Unlike different transistors that might be either 0 or 1, the qubits might come as 0 and 1 at the same time. The capability of a qubits to get the superposition states in the form of different states at the same time, is a great capability in itself for quantum computers. Just like conventional computers, however, quantum computers may need a route for transferring quantum information between the distant seeming qubits, which represents a big experimental challenge. Quantum computers might develop vast multi-dimensional spaces in which we can represent them with even massive issues. On the other hand, classical supercomputers do not hold this capability. Algorithms that utilize quantum wave interference are utilized for finding solutions in space and then utilize them into the different forms that we are able to use and understand. How To Utilize Them? For some of the problems, supercomputers are not the ultimate solution. Until recently, we depended on supercomputers to solve the important issues. These are actually quite large classical computers, which are often with a multitude of classical GPU and CPU cores. However, supercomputers are not quite ideal at solving specific problem types that might seem easy. It is the reason why we may need quantum computers. Larger versions of these issue types might not be solved well by powerful supercomputers due to the fact they do not hold the memory, to capacitate the multitude of combinations of actual life problems. Supercomputers also have to analyze different combinations, one after another, that can require a long time to process. Why Are These Computers Efficient? For some decades, we have seen big companies in the IT sector like IBM who have involved actively in the development of quantum computer systems to solve the issues in new ways. Quantum computers are able to utilize a big multidimensional space in which large issues can be represented. On the other hand, classical supercomputers are not able to do this. Algorithms that utilize quantum wave interference are utilized to look for solutions in the space and then translate them back into the specific forms that we may use or understand. A promising quantum algorithm example would be Grover’s search. For example, let’s say you want to locate an item from a long list containing N items. On a single classical computer, you will need to check half the total items on average, and in the worst-case scenario, you will have to check all the items. Through Grover’s search algorithm, you are able to find the right item after you have checked roughly the root of N of those. It will represent the profound increase in efficiency and the time that is saved. For instance, if you needed to search for an item contained in a list of one trillion, and every item took one microsecond to be checked, then: A conventional computer will take around a week, while a quantum computer will take about a second to complete the search. You don’t have to know the technicalities of these computers in order to avail their full benefits. However, the science that works behind them is quite interesting. It is because it shows a lot of advanced fields that come together in the play of quantum computers. Given the amazing potential of computational resource of quantum computers, you may have the expectation for them to be massive. In fact, these are currently around the size of the domestic refrigerator with an added box the size of a wardrobe, which contains control electronics. Just like how bits are utilized in conventional computers, quantum computers utilize quantum bits or qubits to store information in the form of quantum. These have much better efficiency and performative function as compared to conventional computers, and these can be used to perform a variety of functions with massive data input in a given time simultaneously. This article highlighted the efficiency and the capability of quantum computing over conventional computing in today’s market.
<urn:uuid:1549ffd3-7bc3-4a2f-ba03-b1c2012e006f>
CC-MAIN-2023-06
https://jaisonjacob.com/quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500251.38/warc/CC-MAIN-20230205094841-20230205124841-00346.warc.gz
en
0.938187
1,202
4.25
4
Entanglement is at the heart of quantum physics and future quantum technologies. Like other aspects of quantum science, the phenomenon of entanglement reveals itself at very tiny, subatomic scales. When two particles, such as a pair of photons or electrons, become entangled, they remain connected even when separated by vast distances. In the same way that a ballet or tango emerges from individual dancers, entanglement arises from the connection between particles. It is what scientists call an emergent property. How do scientists explain quantum entanglement? In the video below, Caltech faculty members take a stab at explaining entanglement. Featured: Rana Adhikari, professor of physics; Xie Chen, professor of theoretical physics; Manuel Endres, professor of physics and Rosenberg Scholar; and John Preskill, Richard P. Feynman Professor of Theoretical Physics, Allen V. C. Davis and Lenabelle Davis Leadership Chair, and director of the Institute for Quantum Information and Matter. When researchers study entanglement, they often use a special kind of crystal to generate two entangled particles from one. The entangled particles are then sent off to different locations. For this example, let's say the researchers want to measure the direction the particles are spinning, which can be either up or down along a given axis. Before the particles are measured, each will be in a state of superposition, or both "spin up" and "spin down" at the same time. If the researcher measures the direction of one particle's spin and then repeats the measurement on its distant, entangled partner, that researcher will always find that the pair are correlated: if one particle's spin is up, the other's will be down (the spins may instead both be up or both be down, depending on how the experiment is designed, but there will always be a correlation). Returning to our dancer metaphor, this would be like observing one dancer and finding them in a pirouette, and then automatically knowing the other dancer must also be performing a pirouette. The beauty of entanglement is that just knowing the state of one particle automatically tells you something about its companion, even when they are far apart. Are particles really connected across space? But are the particles really somehow tethered to each other across space, or is something else going on? Some scientists, including Albert Einstein in the 1930s, pointed out that the entangled particles might have always been spin up or spin down, but that this information was hidden from us until the measurements were made. Such "local hidden variable theories" argued against the mind-boggling aspect of entanglement, instead proposing that something more mundane, yet unseen, is going on. Thanks to theoretical work by John Stewart Bell in the 1960s, and experimental work done by Caltech alumnus John Clauser (BS '64) and others beginning in the 1970s, scientists have ruled out these local hidden-variable theories. A key to the researchers' success involved observing entangled particles from different angles. In the experiment mentioned above, this means that a researcher would measure their first particle as spin up, but then use a different viewing angle (or a different spin axis direction) to measure the second particle. Rather than the two particles matching up as before, the second particle would have gone back into a state of superposition and, once observed, could be either spin up or down. The choice of the viewing angle changed the outcome of the experiment, which means that there cannot be any hidden information buried inside a particle that determines its spin before it is observed. The dance of entanglement materializes not from any one particle but from the connections between them. Relativity Remains Intact A common misconception about entanglement is that the particles are communicating with each other faster than the speed of light, which would go against Einstein's special theory of relativity. Experiments have shown that this is not true, nor can quantum physics be used to send faster-than-light communications. Though scientists still debate how the seemingly bizarre phenomenon of entanglement arises, they know it is a real principle that passes test after test. In fact, while Einstein famously described entanglement as "spooky action at a distance," today's quantum scientists say there is nothing spooky about it. "It may be tempting to think that the particles are somehow communicating with each other across these great distances, but that is not the case," says Thomas Vidick, a professor of computing and mathematical sciences at Caltech. "There can be correlation without communication," and the particles "can be thought of as one object." Entanglement can also occur among hundreds, millions, and even more particles. The phenomenon is thought to take place throughout nature, among the atoms and molecules in living species and within metals and other materials. When hundreds of particles become entangled, they still act as one unified object. Like a flock of birds, the particles become a whole entity unto itself without being in direct contact with one another. Caltech scientists focus on the study of these so-called many-body entangled systems, both to understand the fundamental physics and to create and develop new quantum technologies. As John Preskill, Caltech's Richard P. Feynman Professor of Theoretical Physics, Allen V. C. Davis and Lenabelle Davis Leadership Chair, and director of the Institute for Quantum Information and Matter, says, "We are making investments in and betting on entanglement being one of the most important themes of 21st-century science."
<urn:uuid:bde92f40-b69a-405e-8b52-b2cdaf7473f4>
CC-MAIN-2023-06
https://scienceexchange.caltech.edu/topics/quantum-science-explained/entanglement?utm_source=csequantum&utm_medium=caltechnews&utm_campaign=web
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764501555.34/warc/CC-MAIN-20230209081052-20230209111052-00827.warc.gz
en
0.934579
1,136
3.53125
4
Computers and similar electronic devices have gotten faster and smaller over the decades as computer-chip makers have learned how to shrink individual transistors, the tiny electrical switches that convey digital information. Scientists’ pursuit of the smallest possible transistor has allowed more of them to be packed onto each chip. But that race to the bottom is almost over: Researchers are fast approaching the physical minimum for transistor size, with recent models down to about 10 nanometers — or just 30 atoms — wide. “The processing power of electronic devices comes from the hundreds of millions, or billions, of transistors that are interconnected on a single computer chip,” said Dr. Kyeongjae Cho, professor of materials science and engineering at The University of Texas at Dallas. “But we are rapidly approaching the lower limits of scale.” To extend the quest for faster processing speed, the microelectronics industry is looking for alternative technologies. Cho’s research, published online April 30 in the journal Nature Communications, might offer a solution by expanding the vocabulary of the transistor. Conventional transistors can convey just two values of information: As a switch, a transistor is either on or off, which translates into the 1s and 0s of binary language. One way to increase processing capacity without adding more transistors would be to increase how much information each transistor conveys by introducing intermediate states between the on and off states of binary devices. A so-called multi-value logic transistor based on this principle would allow more operations and a larger amount of information to be processed in a single device. “The concept of multi-value logic transistors is not new, and there have been many attempts to make such devices,” Cho said. “We have done it.” Through theory, design and simulations, Cho’s group at UT Dallas developed the fundamental physics of a multi-value logic transistor based on zinc oxide. Their collaborators in South Korea successfully fabricated and evaluated the performance of a prototype device. Cho’s device is capable of two electronically stable and reliable intermediate states between 0 and 1, boosting the number of logic values per transistor from two to three or four. Cho said the new research is significant not only because the technology is compatible with existing computer-chip configurations, but also because it could bridge a gap between today’s computers and quantum computers, the potential next landmark in computing power. While a conventional computer uses the precise values of 1s and 0s to make calculations, the fundamental logic units of a quantum computer are more fluid, with values that can exist as a combination of 1s and 0s at the same time or anywhere in between. Although they have yet to be realized commercially, large-scale quantum computers are theorized to be able to store more information and solve certain problems much faster than current computers. “A device incorporating multi-level logic would be faster than a conventional computer because it would operate with more than just binary logic units. With quantum units, you have continuous values,” Cho said. “The transistor is a very mature technology, and quantum computers are nowhere close to being commercialized,” he continued. “There is a huge gap. So how do we move from one to the other? We need some kind of evolutionary pathway, a bridging technology between binary and infinite degrees of freedom. Our work is still based on existing device technology, so it is not as revolutionary as quantum computing, but it is evolving toward that direction.” “The concept of multi-value logic transistors is not new, and there have been many attempts to make such devices. We have done it.” The researchers discovered they could achieve the physics needed for multi-value logic by embedding zinc oxide crystals, called quantum dots, into amorphous zinc oxide. The atoms comprising an amorphous solid are not as rigidly ordered as they are in crystalline solids. “By engineering this material, we found that we could create a new electronic structure that enabled this multi-level logic behavior,” said Cho, who has applied for a patent. “Zinc oxide is a well-known material that tends to form both crystalline solids and amorphous solids, so it was an obvious choice to start with, but it may not be the best material. Our next step will look at how universal this behavior is among other materials as we try to optimize the technology. “Moving forward, I also want to see how we might interface this technology with a quantum device.” The Latest on: Multi-value logic transistors [google_news title=”” keyword=”multi-value logic transistors” num_posts=”10″ blurb_length=”0″ show_thumb=”left”] via Google News The Latest on: Multi-value logic transistors - Did Elon Musk Warn that 'Woke Mind Virus' Is Destroying Civilization?on January 29, 2023 at 4:49 pm In January 2023, a YouTube ad from the far-right website The Epoch Times read, "Musk Warns About 'Woke Mind Virus' Entertainment Triggering Civilizational Suicide." It's true that Twitter CEO Elon ... - Measles virus 'cooperates' with itself to cause fatal encephalitison January 26, 2023 at 4:01 pm Mutation in the F protein is key for the measles virus to fuse and infect neurons. Two primary strategies exist for such infection. Initially, fusion activity of a mutant F protein is suppressed ... - Researchers discover how measles virus can cause a rare but fatal neurological disorderon January 26, 2023 at 4:01 pm Researchers in Japan have uncovered the mechanism for how the measles virus can cause subacute sclerosing panencephalitis, or SSPE, a rare but fatal neurological disorder that can occur several ... - Experimental vaccine for deadly Marburg virus guards against infection with just a single doseon January 26, 2023 at 8:41 am An experimental vaccine for Marburg virus—a deadly cousin of the infectious agent that causes Ebola—can protect large animals from severe infections for up to a year with a single shot ... - Study reveals the inevitability of Zika virus dumbbell-1 structure for viral pathogenesison January 26, 2023 at 8:37 am In a recent study posted to the bioRxiv* server, researchers designed two dumbbell-1 (DB-1) mutant Zika virus (ZIKV) infectious clones termed ZIKV-TL.PK and ZIKV-p.2.5’. While the former ... - A transnational collaboration leads to the characterization of an emergent plant viruson January 25, 2023 at 4:00 pm Physostegia chlorotic mottle virus (PhCMoV), a plant disease first identified in Austria in 2018, initially received inadequate characterization. This then sparked studies across Europe as new ... - Virus Leaves FaZe Clanon January 25, 2023 at 1:04 pm One of the best FPS players from the Middle East, Virus, announces his departure from FaZe Clan and explains why he's leaving. Virus set his sights on becoming a member of the FaZe Clan in 2019 ... - Fact check: No, coronavirus is not a Latin word for 'heart attack virus'on January 24, 2023 at 10:25 am The claim: Coronavirus is a Latin word for 'heart attack virus' A Jan. 6 Facebook video (direct link, archive link) shows a woman purporting to use Google to translate Latin to English. - Virus plus microplastics equal double whammy for fish healthon January 22, 2023 at 4:00 pm The team wanted to determine if a "cause-and-effect" may occur between microplastics, virus, and fish mortality. Seeley and colleagues thus exposed aquarium-kept rainbow trout to low, medium ... - World-first computational reconstruction of a virus in its biological entiretyon January 19, 2023 at 4:00 pm An Aston University researcher has created the first ever computer reconstruction of a virus, including its complete native genome. Although other researchers have created similar reconstructions ... via Bing News
<urn:uuid:431d3d73-1780-4fb7-b1a0-c0c33719d187>
CC-MAIN-2023-06
https://innovationtoronto.com/2019/06/why-shrink-transistors-any-further-when-you-could-do-this/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500058.1/warc/CC-MAIN-20230203154140-20230203184140-00788.warc.gz
en
0.924694
1,730
3.875
4
Researchers from Chalmers University of Technology, Sweden, have uncovered a striking new behavior of the ‘strange metal’ state of high temperature superconductors. The discovery represents an important piece of the puzzle for understanding these materials, and the findings have been published in the highly prestigious journal Science. Superconductivity, where an electric current is transported without any losses, holds enormous potential for green technologies. For example, if it could be made to work at high enough temperatures, it could allow for lossless transport of renewable energy over great distances. Investigating this phenomenon is the aim of the research field of high temperature superconductivity. The current record stands at −130 degrees celsius, which might not seem like a high temperature, but it is when compared to standard superconductors which only work below −230 degrees celsius. While standard superconductivity is well understood, several aspects of high temperature superconductivity are still a puzzle to be solved. The newly published research focusses on the least understood property – the so called ‘strange metal’ state, appearing at temperatures higher than those that allow for superconductivity. “This ‘strange metal’ state is aptly named. The materials really behave in a very unusual way, and it is something of a mystery among researchers. Our work now offers a new understanding of the phenomenon. Through novel experiments, we have learned crucial new information about how the strange metal state works” says Floriana Lombardi, Professor at the Quantum Device Physics Laboratory at the Department of Microtechnology and Nanoscience at Chalmers. Believed to be based on quantum entanglement The strange metal state got its name because its behavior when conducting electricity is, on the face of it, far too simple. In an ordinary metal, lots of different processes affect the electrical resistance – electrons can collide with the atomic lattice, with impurities, or with themselves, and each process has a different temperature dependence. This means that the resulting total resistance becomes a complicated function of the temperature. In sharp contrast, the resistance for strange metals is a linear function of temperature – meaning a straight line from the lowest attainable temperatures up to where the material melts. “Such a simple behavior begs for a simple explanation based on a powerful principle, and for this type of quantum materials the principle is believed to be quantum entanglement.” says Ulf Gran, Professor at the Division of Subatomic, High-Energy and Plasma Physics at the Department of Physics at Chalmers. “Quantum entanglement is what Einstein called ‘spooky action at a distance’ and represents a way for electrons to interact which has no counterpart in classical physics. To explain the counterintuitive properties of the strange metal state, all particles need to be entangled with each other, leading to a soup of electrons in which individual particles cannot be discerned, and which constitutes a radically novel form of matter.” Exploring the connection with charge density waves The key finding of the paper is that the authors discovered what kills the strange metal state. In high temperature superconductors, charge density waves (CDW), which are ripples of electric charge generated by patterns of electrons in the material lattice, occur when the strange metal phase breaks down. To explore this connection, nanoscale samples of the superconducting metal yttrium barium copper oxide were put under strain to suppress the charge density waves. This then led to the re-emergence of the strange metal state. By straining the metal, the researchers were able to thereby expand the strange metal state into the region previously dominated by CDW – making the ‘strange metal’ even stranger. “The highest temperatures for the superconducting transition have been observed when the strange metal phase is more pronounced. Understanding this new phase of matter is therefore of utmost importance for being able to construct new materials that exhibit superconductivity at even higher temperatures,” explains Floriana Lombardi. The researchers’ work indicates a close connection between the emergence of charge density waves and the breaking of the strange metal state – a potentially vital clue to understand the latter phenomenon, and which might represent one of the most striking evidence of quantum mechanical principles at the macro scale. The results also suggest a promising new avenue of research, using strain control to manipulate quantum materials. For more information, contact: Professor in Microtechnology and Nanoscience, Chalmers University of Technology +46 31 772 3318 Chalmers University of Technology in Gothenburg, Sweden, conducts research and education in technology and natural sciences at a high international level. The university has 3100 employees and 10,000 students, and offers education in engineering, science, shipping and architecture. With scientific excellence as a basis, Chalmers promotes knowledge and technical solutions for a sustainable world. Through global commitment and entrepreneurship, we foster an innovative spirit, in close collaboration with wider society.The EU’s biggest research initiative – the Graphene Flagship – is coordinated by Chalmers. We are also leading the development of a Swedish quantum computer. Chalmers was founded in 1829 and has the same motto today as it did then: Avancez – forward.
<urn:uuid:a3e996d4-80d2-4696-a3c5-361f9e0b36c2>
CC-MAIN-2023-06
https://welum.com/article/strange-metal-state/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764495012.84/warc/CC-MAIN-20230127195946-20230127225946-00788.warc.gz
en
0.910262
1,121
3.671875
4
Photon Qubit is Made of Two Colors The discovery of the photon, the quantum particle of light, played a key role in the development of quantum physics. Today, photons are among the most advanced building blocks for quantum technologies, such as quantum computing , secure communication , and precision measurement . These applications typically rely on quantum control of a photon’s polarization or its spatial mode. Surprisingly, the most manifest property of light—its color or frequency—is difficult to manipulate on the quantum level. An experiment now demonstrates a toolbox for creating, manipulating, and detecting single photons in a quantum superposition of two discrete frequencies . The approach requires an interaction between different frequency components of light, which Stéphane Clemmen from Cornell University, New York, and colleagues have achieved by making use of nonlinear processes in optical fibers. Such photonic quantum bits (qubits) could be useful for connecting quantum systems operating at different frequencies in a quantum network. According to quantum physics, monochromatic light of frequency , such as the light emitted by a laser, is composed of photons of energy , where is the Planck constant. Polychromatic light, such as the light emitted by the Sun, contains photons of many different frequencies. However, each individual photon usually has a well-defined frequency and energy. Interestingly, the superposition principle of quantum physics allows for yet another version of polychromatic light: a single photon in a superposition of two discrete frequencies and . In this case, neither the frequency nor the energy of the photon is well defined. In some sense, such a “bichromatic” photon can be thought of as having two different colors at the same time, only one of which would be revealed if the photon were measured by a spectrometer or detected by eye. However, the creation and manipulation of bichromatic photons turns out to be challenging. The difficulty is that such processes require an interaction between photons of different frequencies, and in most media, light beams do not interact. The situation changes if light propagates in a nonlinear medium, where the optical properties vary with the intensity of light. The nonlinear response results in an interaction between photons, providing a means with which to convert them to a different frequency. To make this process efficient, however, the medium must be pumped with separate high-power laser beams. These beams are tuned to a different frequency than the weak single-photon signal, but care must be taken to avoid noise processes that generate additional photons in the signal channel. Several experiments have succeeded in creating suitable single-photon nonlinearities. Clemmen et al. build on this work by not only creating bichromatic photons, but also by manipulating them and demonstrating their coherence. In their experiments, Clemmen et al. realize the efficient frequency conversion of single photons with a process called Bragg-scattering four-wave mixing. The main ingredient in this approach is a 100-m-long fiber, which the authors pump with two laser beams of unequal frequencies to obtain the required nonlinear response. The frequency difference between the beams determines the difference between the frequencies and involved in the final single-photon superposition. The team achieves the necessary low noise level by cooling the fiber in a cryostat. Within this setup, sending a single photon of frequency and quantum state through the fiber converts the photon state into a superposition, , where is the mixing angle and is the relative phase between the frequency components. The angles and can be adjusted by tuning the amplitude and phase of the pump lasers, allowing the researchers to encode one bit of quantum information on the photon (see representation at the top of Fig. 1). Importantly, Clemmen et al. prove that they have generated a coherent superposition rather than an incoherent mixture in which photons randomly acquire one of the two frequencies. To do so, the researchers perform Ramsey spectroscopy, a technique commonly used to measure coherence in atomic clocks or nuclear magnetic resonance (NMR). The Ramsey sequence is illustrated in Fig. 1. The researchers adjust the pump lasers such that after the photon passes through the fiber it is in an equal superposition of the two photon frequencies, or . In the terminology of NMR, this corresponds to a pulse. Subsequently, they adjust the phase, , by introducing a propagation delay. Finally, they send the photon through the fiber a second time, corresponding to a second pulse, which converts the state to . To analyze the state, they separate the two frequency components and detect each with a single-photon detector. When they vary the propagation delay, and therefore , the probability of detecting the photon oscillates sinusoidally between the two detectors, as expected. The observed contrast of these “Ramsey fringes” reaches up to 65%, proof that a coherent superposition is being generated. Moreover, the researchers show that there is only a small probability of detecting more than one photon at a time, confirming that their superposition state preserves the single-photon character of the initial state. It is very appealing to see that a single particle of light can be in a superposition of two different colors. So far, the wavelengths involved are in the infrared near 1280 nm—outside the range of human vision—and differ by about 4 nm. But when translated to the visible spectrum (roughly 380–780 nm), such a wavelength difference could be discriminated with the bare eye. In the future, the methods that Clemmen et al. have demonstrated could be used to interface quantum systems operating at different frequencies, such as solid-state and atomic quantum memories [5–8]. One could envision two physically different quantum memories, each absorbing one part of the single photon in a frequency superposition. This would entangle the quantum memories because they share a single excitation in a coherent way. Such a protocol may be useful for creating quantum networks [9, 10], which could be the basis for quantum communication, computing, and simulation. Another application of the bichromatic qubits could be spectroscopy that requires only small amounts of light: the idea would be to look for spectrally dependent phase changes in the qubits’ states. These applications would benefit from extending the technique demonstrated by Clemmen et al. to larger frequency or wavelength differences. - P. Kok, W. J. Munro, K. Nemoto, T. C. Ralph, J. P. Dowling, and G. J. Milburn, “Linear Optical Quantum Computing with Photonic Qubits,” Rev. Mod. Phys. 79, 135 (2007). - N. Gisin and R. Thew, “Quantum Communication,” Nature Photon. 1, 165 (2007). - J. Aasi et al. (The LIGO Scientific Collaboration), “Enhanced sensitivity of the LIGO Gravitational Wave Detector by Using Squeezed States of Light,” Nature Photon. 7, 613 (2013). - S. Clemmen, A. Farsi, S. Ramelow, and A. Gaeta, “Ramsey Interference with Single Photons,” Phys. Rev. Let. 117, 223601 (2016). - I. Usmani, C. Clausen, F. Bussières, N. Sangouard, M. Afzelius, and N. Gisin, “Heralded Quantum Entanglement Between Two Crystals,” Nature Photon. 6, 234 (2012). - A. G. Radnaev, Y. O. Dudin, R. Zhao, H. H. Jen, S. D. Jenkins, A. Kuzmich, and T. A. B. Kennedy, “A Quantum Memory with Telecom-Wavelength Conversion,” Nature Phys. 6, 894 (2010). - M. T. Rakher, L. Ma, M. Davanço, O. Slattery, X. Tang, and K. Srinivasan, “Simultaneous Wavelength Translation and Amplitude Modulation of Single Photons from a Quantum Dot,” Phys. Rev. Lett. 107, 083602 (2011). - J.-P. Jahn, M. Munsch, M. Davanço, O. Slattery, X. Tang, and K. Srinivasan, “An Artificial Rb Atom in a Semiconductor with Lifetime-Limited Linewidth,” Phys. Rev. B 92, 083602 (2015). - H. J. Kimble, “The Quantum Internet,” Nature 453, 1023 (2008). - N. Sangouard, C. Simon, H. de Riedmatten, and N. Gisin, “Quantum Repeaters Based on Atomic Ensembles and Linear Optics,” Rev. Mod. Phys. 83, 33 (2011).
<urn:uuid:cf9bca67-71a7-4f90-9e99-2f87e883081a>
CC-MAIN-2023-06
https://physics.aps.org/articles/v9/135
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500813.58/warc/CC-MAIN-20230208123621-20230208153621-00109.warc.gz
en
0.903931
1,877
4.125
4
KENSINGTON, Australia — Quantum computing could soon become a reality that changes digital technology forever after a milestone achievement by researchers in Australia. The team has proven that virtually error-free computer operations are possible using a silicon-based quantum device. Moreover, scientists found it’s possible to build these lightning-fast computers using current semiconductor manufacturing technology available today. “Today’s publication shows our operations were 99 percent error-free,” says Professor Andrea Morello from the University of New South Wales-Sydney in a release. “When the errors are so rare, it becomes possible to detect them and correct them when they occur. This shows that it is possible to build quantum computers that have enough scale, and enough power, to handle meaningful computation.” Morello, who leads a team of researchers from the United States, Japan, Egypt, and Australia, is building what they call a “universal quantum computer” that is capable of performing more than one application. “This piece of research is an important milestone on the journey that will get us there,” Prof. Morello adds. Why are quantum computers so special? In a nutshell, quantum computers find better and quicker ways to solve problems. Scientists believe quantum technology could solve extremely complex problems in seconds, while traditional supercomputers you see today could need months or even years to crack certain codes. What makes these next generation supercomputers different from your everyday smartphone and laptop is how they process data. Quantum computers harness the properties of quantum physics to store data and perform their functions. While traditional computers use “bits” to encode information on your devices, quantum technology uses “qubits.” The main difference between these two are that bits process information in binary fashion — meaning something is either a “0” or “1” or a yes/no answer. They represent this two-choice system through the absence or presence of an electrical signal in the computer. Qubits, on the other hand, use quantum objects which act as information processors — such as spin (controlling the spin of charged particles in a semiconductor), trapped atoms or ions, photons (particles of light), or semiconducting circuits. Like a bit, qubits also have two distinctive states representing “0” and “1,” but they are also capable of working in “superposition” states as well. A qubit can account for incompatible measurements (beyond 0 and 1) and even entangle with other qubits. All this makes them incredibly more powerful than the average computer bit. Cracking the 99 percent threshold The new study actually features three separate reports which detail the researchers’ breakthrough into super-accurate quantum computing. Prof. Morello’s team achieved a one-qubit operation fidelity of 99.95 percent, meaning a qubit’s ability to successfully pass a test. They also achieved a two-qubit fidelity of 99.37 percent. The team conducted this test using a three-qubit system consisting of an electron and two phosphorous atoms inside silicon. Another team in the Netherlands reached the 99 percent accuracy threshold using qubits consisting of electron spins in a stack of silicon and silicon-germanium alloy (Si/SiGe). Finally, a third team in Japan broke the 99 percent barrier with a two-electron system using Si/SiGe quantum dots. Scientists are focusing on using qubits in silicon because of their stability and capability to hold quantum information for long periods of time. Prof. Morello’s previous studies demonstrated that he could preserve quantum data in silicon for 35 seconds. That may not sound like a lot to the average person, but it’s nearly a lifetime for quantum computers. “In the quantum world, 35 seconds is an eternity,” Prof. Morello explains. “To give a comparison, in the famous Google and IBM superconducting quantum computers the lifetime is about a hundred microseconds – nearly a million times shorter.” Scientists discover how to make qubits interact with each other The biggest breakthrough in the study, researchers say, is overcoming the need to isolate individual qubits in the computing process. Until now, it’s been seemingly impossible for qubits to interact with each other. The team used an electron encompassing two nuclei of phosphorus atoms to overcome this problem. “If you have two nuclei that are connected to the same electron, you can make them do a quantum operation,” says study author Mateusz Mądzik. “While you don’t operate the electron, those nuclei safely store their quantum information. But now you have the option of making them talk to each other via the electron, to realize universal quantum operations that can be adapted to any computational problem.” “This really is an unlocking technology,” adds Dr. Serwan Asaad. “The nuclear spins are the core quantum processor. If you entangle them with the electron, then the electron can then be moved to another place and entangled with other qubit nuclei further afield, opening the way to making large arrays of qubits capable of robust and useful computations.” With this breakthrough, study authors say semiconductor spin qubits in silicon could soon become the platform of choice as scientists build the next wave of reliable quantum computers. “Until now, however, the challenge has been performing quantum logic operations with sufficiently high accuracy,” Prof. Morello concludes. “Each of the three papers published today shows how this challenge can be overcome to such a degree that errors can be corrected faster than they appear.” The findings are published in the journal Nature.
<urn:uuid:3b634200-1efb-4e40-915e-069395aef4b6>
CC-MAIN-2023-06
https://studyfinds.org/quantum-computing-accuracy/?utm_campaign=Study%20Time%21%20Latest%20Articles%20From%20StudyFinds&utm_medium=email&utm_source=Revue%20newsletter
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500250.51/warc/CC-MAIN-20230205063441-20230205093441-00229.warc.gz
en
0.920487
1,203
3.703125
4
As our demand for powerful processors rises, our need for a solution outside classical computing mounts. Quantum computing could help solve some of the more complex problems plaguing us. With quantum computers, we could map complex climate systems, solve impossibly complex encryption puzzles, and simulate advanced chemical processes. And this is just the tip of the iceberg. As a result, there is an international race to arrive at a scalable, commercial quantum computer. Researchers all over the world are working diligently to find the perfect material to harness the power of a quantum bit, or qubit. How does a quantum bit differ from a normal bit? In a classical computer, bits have discrete states. These bits are either 0 or 1. A pulse of energy, either in the form of 0 -- which is not the same as a complete lack of electrical impulse -- or a 1, is sent through transistors. These strings of zeroes and ones are simply instructions to the hardware. A string of these commands make up a byte. A string of bytes make up kilobytes, megabytes, gigabytes, terabytes and so on and so forth. The software your computer uses is equipped to translate the commands supplied by a stream of bits. So, when a programmer creates a program in C++ or Java, for example, the words and phrases and mathematical equations she uses can all be reduced to zeroes and ones. As you have probably already surmised, for the most part, a bit can be either one or the other: 1 or 0. There are two choices and no more. This is why coding in ones and zeroes is often referred to as binary code, and where the name bit, which stands for binary digit, comes from. In quantum mechanics, there is something called superposition. Many are familiar with the famous Schrödinger's Cat thought experiment, in which an unseen cat is both alive and dead at the same time. In our everyday physical world, this idea does not make too much sense. A cat is either alive or dead. In other words, organic life is binary. Classical physics and classical computers follow this logic. They live in a binary world. Qubits, however, are not bound to binary. Qubits can be either 0 or 1, or a combination of those states. This is the principle of superposition at work. “A qubit can be thought of like an imaginary sphere,” writes Abigail Beall for Wired. “Whereas a classical bit can be in two states - at either of the two poles of the sphere - a qubit can be any point on the sphere.” What’s so intriguing about qubits is their interactions with each other. In quantum computing, the sum is much larger than its individual parts. “Every time I add a quantum bit to a quantum computer, I double the computational power,” explains Michelle Simmons, the lead quantum researcher at the University of New South Wales, in a recent talk. “It’s predicted that… a 30-qubit computer... will be more powerful than the world’s most powerful supercomputer.” These quantum computers would be powerful enough to run sophisticated AI programs that could disrupt finance, medicine, and engineering industries. To put this 30-qubit figure in context, IBM is, at present, leading the charge with a 16-qubit chip. “IBM Q has successfully built and tested two of its most powerful universal quantum computing processors to date,” IBM boasts. “16 qubits for public use and a 17-qubit prototype commercial processor.” IBM, Microsoft, and Google are making these qubit chips by submerging superconductors in subzero temperatures. Simmons and her team are simply imprinting atoms in silicon. Other researchers have taken a more rigorous approach. They applied the theory of time crystals, an idea proposed in 2012 by Nobel laureate Frank Wilczek, to build their quantum technologies. The University of Maryland and Harvard University synthesized time crystals in their own research labs, using disparate approaches. In University of Maryland’s system, they use an ion trap system to form patterns in time. Harvard exploited flaws in diamonds, a spatial crystal, to synthesize a discrete time crystal of their own. What could these sci-fi-sounding time crystals be used for? “Time crystals,” Wilczek said in a recent presentation to university students, “are just what the doctor ordered for this technology.” Indeed, quantum computers are sensitive and require a very precise global clock, a potential use for the new type of matter. Further, time crystals could be used for information tasks and quantum memory. Qubits, superconductors, and time crystals. As far-flung as the future of quantum computing may appear to be, it is likely closer than you think. We need more computationally complex machines to power our most pressing problems. Quantum computers could lead us to some very compelling solutions. About the author: Josh Althauser is an entrepreneur with a background in design and M&A. He's also a developer, open source advocate, and designer. You may connect with him on Twitter. Antivirus software is not enough. Apex Technology Services used its decades of IT and cybersecurity experience to create budget-friendly network security packages every company needs. Please take a moment to fill out your information so we can contact you directly regarding your request. Generative AI Expo is the starting point for you research the countless potentially game-changing pillars that may solidify generative AI as THE indis… A machine learning model is a mathematical representation of a system or process that is trained to make predictions or decisions based on data. It is… Continuous deployment (CD) is a software development practice where code changes are automatically built, tested, and deployed to production without h… A master data management (MDM) platform is invaluable for any business. By centralizing data into one cohesive system, companies can improve their ope… An SBOM, or software bill of materials, is a list of all the components and dependencies that make up a piece of software. This can include things lik…
<urn:uuid:80bfdb80-c1e0-4f09-b030-425eb938c58e>
CC-MAIN-2023-06
https://www.techzone360.com/topics/techzone/articles/2018/02/13/436983-everything-need-know-quantum-computing.htm
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499646.23/warc/CC-MAIN-20230128153513-20230128183513-00510.warc.gz
en
0.923763
1,286
4.15625
4
You may have been reading about breakthroughs in the area of quantum computing, including Google’s announcement that it had created a “time crystal” — a new form of matter — in a quantum computer. The truth, however, is that today’s quantum computers have limitations. Which is why Norman Yao, a molecular physicist at the University of California at Berkeley, states, “Time crystals are like a rest stop on the road to building a [better (i.e., more functional)] quantum computer.” Journalist Dalvin Brown (@dalvin_brown) reports, “Time crystals are scientific oddities made of atoms arranged in a repeating pattern in space. This design enables them to shift shape over time without losing energy or overheating. Since time crystals continuously evolve and don’t seem to require much energy input, they may be useful for quantum computers, which rely on extremely fragile qubits that are prone to decay. Quantum computing is weighed down by hard-to-control qubits, which are error prone and often die. Time crystals might introduce a better method for sustaining quantum computing, according to Yao, who published a blueprint for making time crystals in 2017.” Even though scientists remain in the early stages of developing a more functional quantum computer, futurists are already thinking about how useful they will be in the supply chain sector. What are Quantum Computers? Unlike traditional computers where a bit is either a one or a zero, a quantum bit (or qubit) can simultaneously be both a one and a zero. Like many things found at the quantum level, qubits defy logic. The weirdness continues with the fact that a quantum particle can simultaneously appear to be in two places at once. This phenomenon — called entanglement — involves a pair of quantum particles linked together in a such a way that when one particle is altered its twin is instantaneously altered in exactly the same way regardless of how far apart the entangled particles may be. Professor Albert Einstein famously called entanglement “spooky action at a distance.” Because a qubit can simultaneously be both a 0 and a 1, quantum machines compute differently than traditional machines. James Norman explains, “Quantum computers can be game changers because they can solve important problems no existing computer can. While conventional computing scales linearly, QC scales exponentially when adding new bits. Exponential scaling always wins, and it’s never close.” To learn more about quantum computers, watch the following video. Eric Limer insists, “Someday, somehow, quantum computing is going to change the world as we know it. Even the lamest quantum computer is orders of magnitude more powerful than anything we could ever make today. But figuring out how to program one is ridiculously hard.” How Can Quantum Computers Improve Logistics? Almost weekly, scientists are making breakthroughs that are paving the way to more functional quantum computers — including how to program them. As a result, more people are beginning to consider how a quantum computer could be used to improve logistics. Robert Liscouski, President and CEO of Quantum Computing Inc., explains, “World events ranging from the Suez Canal blockage to the global COVID-19 pandemic have shown how susceptible our supply chain management and logistics systems are to changes in consumer and business demand, raw materials availability, shipping, and distribution. The field of constrained optimization is well matched to address these needs, yet today’s classical computers can hit a wall amidst growing volumes of data and unpredictable disruptions. New software solutions will combine the power of classical and quantum computing to help planners stay ahead.” If you are not familiar with “constrained optimization,” Liscouski explains that constrained optimization is a field of mathematics that addresses problems in which “optimizing a function’s variables (e.g., trucks, SKUs, people)” are important, but optimized solutions must take “into account their constraints (e.g., cost, volume, time), for better business decision-making and efficiency.” In a subsequent article, Liscouski notes, “The goal of a supply chain organization is to meet customer requirements while minimizing total supply chain costs. Businesses must be flexible enough to respond quickly when disruptions occur.” In the area of logistics, he notes, being agile isn’t easy. This is especially true, he states, when it comes to last mile logistics. He explains, “The last mile grows even more complex. The last mile has always been the most expensive, long-bemoaned challenge of the supply chain. With the ‘new normal’ of changing consumption habits and channels creating unpredictable demand, forecasts have become meaningless. This makes agility and speed to optimization that much more important to meet customers’ growing expectations for instant availability and near-immediate delivery.” To demonstrate why quantum computers could help logistics optimization, Liscouski provides an example: “Many of us have heard of the traveling salesman problem, which can be compared to truck routing and how to optimize the routes, as well as the trucks. The challenge is that traveling salesman problems like this grow in complexity by n! (n factorial). Routing problems are more constrained and complex for every variable (truck, route, driver, etc.) that you add. For example, a traveling salesman problem that has 10 stops results in 3,628,800 route options, 40 stops will result in approximately 40! = 815,915,283,2 00,000,000,000,000,000,000,000,000,000,000,000,000 options. Routing multiple trucks and packages is even more complex.” As a result, Liscouski writes, “A classical computer would struggle under the weight and scale of a vast set of possibilities. This is where quantum computers promise to take on the task to quickly produce options to choose from to make the best decision based on your goals. Complicated scenarios meant to solve for multiple variables are not achievable by a classical computing algorithm in a short span of time. However, algorithms using quantum computing techniques can quickly achieve this simulation using a classical system applying quantum techniques, or a hybrid solution that employs both quantum and classical, today.” Liscouski notes that both Accenture and IDC insist quantum computing will benefit supply chain optimization efforts. He reports that Accenture concludes, “Route-optimization algorithms are helping reduce mileage and improve on-time delivery rates. In logistics, quantum routing uses cloud-based, quantum computing to calculate the fastest route for all vehicles, taking into account millions of real-time data points about traffic congestion.” And IDC research concludes, “The ability to ingest broad and deep data sets to inform better decision making will be the single largest differentiator of supply chain performance in the future.” Although it comes as no surprise that Liscouski, whose company offers quantum software, is sanguine about quantum computing’s future in the supply chain, I agree with his conclusion: “Quantum computing is one of the most promising technological innovations likely to shape, streamline and optimize the future of the supply chain. It offers better insights to make better decisions. That’s why there’s so much excitement about it.” To learn more, read Stephen DeAngelis, “Google and the Quantum Time Crystal,” Enterra Insights, 13 August 2021. Dalvin Brown, “Google’s new ‘time crystals’ could be a breakthrough for long-awaited quantum computers,” The Washington Post, 12 August 2021. James Norman, “Quantum Computing Will Revolutionize Data Analysis. Maybe Soon,” Seeking Alpha, 14 March 2018. Eric Limer, “Why Programming a Quantum Computer Is So Damn Hard,” Gizmodo, 23 August 2013] Robert Liscouski, “Quantum Computing: A New Solution for Supply Chain and Logistics Optimization,” Material Handling & Logistics, 4 August 2021. Robert Liscouski, “How Quantum Computing Will Power the Future of Logistics,” SupplyChainBrain, 8 August 2021.
<urn:uuid:66b8cd37-7fa7-44d0-ab18-72320d947044>
CC-MAIN-2023-06
https://enterrasolutions.com/quantum-computing-and-the-future-of-logistics/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500334.35/warc/CC-MAIN-20230206082428-20230206112428-00190.warc.gz
en
0.923964
1,765
3.59375
4
Three scientists who have made seminal contributions to the experimental study of quantum entanglement and its applications share the Nobel Prize in Physics in 2022. Scientists John Clauser of the United States and Alain Aspect of France devised a method to definitively detect entanglement between photons. Quantum communication relies on entanglement, which was first successfully transmitted by Anton Zeilinger of the University of Vienna. The technologies of the future include quantum computing and quantum communication. Because they allow for rapid resolution of difficult problems and the use of “unbreakable” encrypted data. Particles like photons, ions, and atoms act under quantum physical phenomena like superposition and entanglement. Due to these occurrences, quantum computers can process vast amounts of data in a short amount of time, and quantum signals can be “teleported” almost instantly. The mystery of “spooky action at distance” Quantum entanglement has been described as “spooky action at a distance” by Albert Einstein and as the most crucial aspect of quantum physics by Erwin Schrödinger. Up until the measurement of the state of one of the entangled particles, the other remains in a superposition state, not knowing which of the two it is. Only then does the second one decide on its state simultaneously. All current quantum technologies are reliant on the observation of quantum entanglement. One analogy for quantum entanglement is that of two balls, one white and one black, whose superposition in midair renders them gray. The ultimate color of each ball is revealed only when one of them is captured. Simultaneously, it becomes obvious that the second ball is the opposite color. However, this raises the issue of how the balls determine which color they need to take on. Are their colors coincidental or do they potentially contain information that foretells the color they’ll show up in advance? Physicist John Stewart Bell suggested a theoretical potential in the 1960s for empirically clarifying this issue. According to this, a real entanglement without hidden variables would have to exhibit a specific degree of correlation when the measurements are repeated numerous times. But how to assess this in a realistic manner remained uncertain. John Clauser and Alain Aspect: The Bell test becomes practical The first prize winner of the 2022 Nobel Prize in Physics was the American physicist John Clauser for his work in this area. For the first time, he devised an experiment to prove that quantum entanglement is really possible and that Bell’s inequality could be broken. The scientist accomplished this by generating polarization-entangled pairs of photons. Clauser found out how frequently each combination happened by passing these photons through various polarization filters. As a result, it was clear that the entangled photons did disprove Bell’s inequality. There was no way to predict or account for the strength of the relationships. Instead, it was a “spooky action at distance” effect in which the measurement of one particle determines the state of another, nullifying the superposition. Clauser and his team’s experiment was exceedingly inefficient, however, since only a tiny percentage of the created photons were traceable through the filters and hence measurable. French physicist Alain Aspect, who came in second for the 2022 Physics Nobel Prize, decided to interfere here. He refined the experiment by separating the entangled photons and measuring them after they passed through two polarizers. Anton Zeilinger: Quantum teleportation and quantum amplification When sending optical information over long distances, for example via a fiber-optic cable, the light signal degrades, limiting the range; this is the issue that Anton Zeilinger of the University of Vienna addressed, and it is strongly connected to quantum entanglement. Over a distance of 6 miles (10 kilometers), about one photon is lost per second. Standard optical transmissions include intermediate amplifiers that account for this. Unfortunately, this cannot be done with entangled photons; the amplifier’s need to read out the signal before boosting it would destroy the quantum signal by canceling the entanglement. In 1998, Zeilinger and his group solved the problem using quantum teleportation. This stems from the discovery that one entangled pair of photons may impart that entanglement to another. As a result, all a quantum amplifier has to do to transport the entanglement and the quantum information it carries from one pair of photons to another is to guarantee that the two pairs make contact with each other under the correct conditions. This finding paves the way for the use of fiber optic cables to carry quantum communications across significant distances. Photons from the sun have also been “entangled” by scientists. Early adopters of quantum technology The three physicists who shared the 2022 Nobel Prize in Physics have thereby provided the groundwork for the eventual practicality of quantum technology. Their research on entangled states is groundbreaking. The Nobel Foundation explains that this is because “their results have cleared the way for new technology based upon quantum information.”
<urn:uuid:6ae5657a-bd22-490d-96d4-5bc2ff199929>
CC-MAIN-2023-06
https://malevus.com/2022-physics-nobel-prize/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500041.2/warc/CC-MAIN-20230202232251-20230203022251-00194.warc.gz
en
0.931918
1,030
3.6875
4
Quantum computing (QC) leverages quantum mechanics to enable a vastly different mode of computation than computers based on classical physics, including conventional von Neumann systems. A quantum bit (qubit), like a classical bit, takes a binary 0 or 1 value when measured, usually at the end of a quantum computation. However, the value of a qubit is not deterministic. A quantum state of n interacting qubits is parameterized by 2n complex numbers, which are called amplitudes and cannot be accessed directly; measuring such a state produces a single random n-bit classical string with probability dictated by a corresponding amplitude. A powerful feature of quantum computation is that manipulating n qubits allows users to sample from an exponentially larger probability distribution over 2n outcomes. However, an analogous claim can be made for randomized classical algorithms operating on n probabilistic bits (e.g., flipping n coins). A key difference between the two is that quantum algorithms seem to be able to sample from certain kinds of probability distributions that may take exponentially longer for randomized classical algorithms to mimic. For example, Shor’s seminal 25-year-old quantum algorithm for factoring integers requires exponentially fewer steps than the best-known classical counterparts. Exponential quantum advantages are also known for other fundamental scientific problems such as solving a certain kind of linear systems of equations and simulating quantum-mechanical systems, currently a critical bottleneck in many physical and chemical applications. The precise source of quantum computational advantage is not well understood; however, it is attributed in part to quantum computation’s ability to efficiently generate entanglement among qubits, yielding probability distributions with correlations that in some cases overstep the reach of efficient classical algorithms. Successes in designing theoretical quantum algorithms have fueled the hope that other quantum advantages can be discovered and exploited. Ideal quantum advantages would provide: (i) an exponential (or at least super-polynomial) computational speedup, (ii) practical applications, and (iii) implementation on a physically realizable quantum system (ideally scalable). A foremost open question in quantum computing is whether all three of these can be simultaneously achieved. A significant hurdle for (iii) is that prepared quantum states are fragile and highly susceptible to environmental noise and rapid entropic decay. Contemporary quantum information science (QIS) research addresses (i) and (ii) by developing novel quantum algorithms and applications and (iii) through scientific and engineering efforts to develop noise-resilient and scalable quantum infrastructure. After decades of steady progress, mainly in academia, the past five years have seen an explosion of interest and effort in QIS. The fifteen years of QC research at Sandia spans the Labs’ expertise from theoretical computer science and physics to microelectronic fabrication, laboratory demonstrations, and systems engineering. Hardware platforms developed at Sandia include a variety of efforts in trapped-ion, neutral atom, and semiconductor spin qubits. Complementary theoretical efforts have created unique capabilities, from quantum characterization verification and validation protocols to multi-scale qubit device modeling tools. Even efforts that are ostensibly purely theoretical, such as quantum algorithms development, are tied to applications of interest ranging from optimization and machine learning to materials simulation The breadth of current Sandia research activities coupled with the longevity of Sandia’s program have established Sandia as a leading U.S. National Laboratory in QC and broader QIS research. Most recently, Sandia has been successful in securing a number of quantum computing projects funded by the recent push from DOE Office of Science and the National Nuclear Security Administration. Among these projects, closest to the hardware, are the Advanced Scientific Computing Research (ASCR)-funded Quantum Scientific Open User Testbed (QSCOUT) and Quantum Performance Assessment (QPerformance) projects. In just over a year, the first edition of the QSCOUT testbed with three trapped-ion qubits was stood up. While this will be increased to thirty-two qubits in time, the testbed is most significant for providing researchers complete access to generation of the control signals that specify how gates are operated so they can further investigate the quantum computer itself. A critical component of this effort is the Sandia-developed Jaqal quantum assembly language which will be used to specify programs executed on QSCOUT. The QPerformance project is aimed at creating techniques for evaluating every aspect of a testbed QC’s performance and understanding and tracking how these change with improvements to the QC hardware and software. The effort isn’t limited to the QSCOUT testbed and it will invent and deploy platform-independent holistic benchmarks that will capture high-level characteristics that will be predictive in evaluating the suitability of QC platforms for DOE mission-relevant applications. At the next level of the computing hierarchy sits the ASCR-funded “Optimization, verification and engineered reliability of quantum computers” (OVER-QC). Led by Sandia, this project aims to develop tools that get the most out of near-term QC hardware, which will be noisy and imperfect. By developing specialized techniques to interpret the output, and to increase the reliability of such noisy hardware, OVER-QC aims to understand and push the limits of QC hardware. Sandia complements these efforts driven by near-term QC hardware with ASCR-funded efforts focusing on developing fundamental hardware-agnostic quantum algorithms for future fault-tolerant quantum computers. These Sandia-led projects, “Quantum Optimization and Learning and Simulation” (QOALAS) and “Fundamental Algorithmic Research for Quantum Computing” (FAR-QC), are multi-institutional interdisciplinary efforts leveraging world-class computer science, physics, and applied mathematics expertise at Sandia and more than ten partner institutions. QOALAS seeks to develop novel quantum algorithms enabling new applications in optimization, machine learning, and quantum simulation. FAR-QC expands upon the scope of QOALAS to identify problems and domains in which quantum resources may offer significant advantages over classical counterparts. Some of the achievements of these projects include new quantum algorithms offering significant advantages for solving linear systems, convex optimization, machine learning kernels, and rigorous simulation of physical systems. Among the key mission priorities of Sandia are those related to stockpile stewardship. The Advance Simulation and Computing (ASC)-funded Gate-Based Quantum Computing (GBQC) project is focused on understanding the prospects for QC platforms to eventually have significant impacts on the unique problems of stockpile stewardship. In this context, quantum simulation is a key capability. Sandia’s stockpile stewardship mission requires models for the behavior of materials in extreme conditions that are both challenging and expensive to evaluate experimentally. GBQC is focused on understanding what will be required to realize a simulation capability that would be exceptionally impactful to ASC and the broader DOE. Recent research directions have broadened the scope of this work to understand the impacts that QCs might have on numerical linear algebra, which is a key capability for not only ASC applications, but most computational science. Sandia has spent fifteen years developing a strong program in QIS and QC to better serve DOE and NNSA customers. As a result, Sandia is poised to be a leader in the fields of QIS and QC research, while integrating capabilities across the whole QC stack.
<urn:uuid:20b85f18-d0df-4ede-a64e-15cfc18c57d7>
CC-MAIN-2023-06
https://www.sandia.gov/news/publications/hpc-annual-reports/article/quantum-information/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764501066.53/warc/CC-MAIN-20230209014102-20230209044102-00594.warc.gz
en
0.914503
1,491
3.609375
4
As we look toward the next decade and ask ourselves “What is the future of education?”, we can only know one thing for sure… nothing is certain. However, it is clear that technology will continue to play a significant role in shaping the ways we learn and teach. One area that appears destined to see significant growth and evolution is that of virtual learning and online education. In recent years, there has been a notable increase in the number of educational institutions offering online courses, and this trend is expected to continue. With the widespread availability of high-speed internet and affordable computing devices, students now have greater access to educational content from any location… but this is just the start. So what does this mean for the future, and what core pillars of technology will play a role? Let’s start with virtual campuses — online learning environments that allow students and teachers to communicate and interact in a 3D space, in real-time, in a similar way to how they would in a physical environment. This is our core focus at Axon Park, to enable remote learning in collaborative, social, 3D worlds where students and educators can feel a shared sense of presence, no matter where they are in the physical world. One of the major advantages of virtual campuses is their flexibility. Students have the freedom to learn from anywhere, while institutions can significantly reduce their overhead. 3D virtual learning environments also offer significant engagement and interactivity benefits, which can arguably go above and beyond the physical world. For example, you could take a virtual field trip to the center of the earth, or to a subatomic scale. Try doing that in Kansas, Dorthy! Virtual classrooms are not just a convenient alternative to in-person learning; they have the potential to revolutionize the way we teach and learn. The use of immersive virtual reality (VR) technology has the potential to transform the way in which we experience and interact with educational content. Imagine being able to visit ancient Rome or explore the depths of the ocean, all from the comfort of your own home. In addition to providing immersive learning experiences, VR can also be used to simulate real-world scenarios and allow students to practice and apply their knowledge in a safe and controlled environment. For example, a student studying biology could use VR to dissect a virtual frog, or a student learning about engineering could use VR to design and build virtual rocket engines. This type of experiential learning can be especially beneficial for students who may not have access to real-world opportunities to practice and apply their knowledge. On top of this, the interactive nature of VR environments with head, hand, eye, and other types of tracking, provide significantly more learner interaction data than any prior digital platform. This allows for robust real-time analytics which can be used to support the learner and provide insights to educators. VR technology has the ability to make learning more immersive, engaging, and interactive, and it’s likely that we will see a greater incorporation of this technology into educational settings in the coming years. AI Personalized Learning On top of the advancements in virtual campus based learning and VR, it is also anticipated that there will be a greater focus on personalized learning in the future. This approach involves the use of artificial intelligence (AI) and other technologies to assess a student’s strengths and weaknesses, and provide customized learning experiences accordingly. This can be accomplished through the use of adaptive learning algorithms, which are able to analyze a student’s performance on quizzes, exams, and other assessments and adjust the content and difficulty level of future lessons accordingly. As we saw in the section above, the rich flow of data from interactive VR simulations can be a powerful feed for the AI. In addition to adapting content to a student’s individual abilities, AI can also be used to provide personalized feedback and support. For example, an AI tutor could analyze a student’s work and provide specific recommendations for improvement, or suggest additional resources for further study. This type of personalized support can be especially beneficial for students who may be struggling with certain concepts or who need extra help to keep up with their peers. When we look into the distant future, we should also consider how quantum computing has the potential to revolutionize the way we process and analyze information. By utilizing the principles of quantum mechanics, quantum computers are able to perform certain calculations much, much, faster than traditional binary computers. This has the potential to significantly impact the field of education, as it could allow for the development of new educational tools and technologies that are able to compute and analyze massive amounts of data in real-time. Quantum simulations may be able to replicate the workings of physical reality in ways that are impossible for the binary computers of today. For example, a quantum computer could be used to simulate protein folding, then provide scientists with extremely relevant and nuanced real-time feedback. Overall, while the potential applications of quantum computing in education are still largely unexplored, it is clear that this technology has the potential to significantly impact the way we learn and teach in the distant future. In conclusion, the future of education can’t be predicted with certainty. However, it is clear that technology will continue to have a major impact on the way in which we learn and teach. From virtual campuses and VR to AI personalized learning, the next 20 years are sure to bring exciting developments and innovations in the field of education.
<urn:uuid:2b458bbc-9e50-4660-bccc-0affbff436ee>
CC-MAIN-2023-06
https://axonpark.com/what-is-the-future-of-learning/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764500028.12/warc/CC-MAIN-20230202133541-20230202163541-00835.warc.gz
en
0.946755
1,114
3.5625
4
Two milliseconds – or two thousandths of a second – is an extraordinarily long time in the world of quantum computing. On these time scales, a blink of an eye – to a 10th of a second – is like an eternity. Now a team of researchers from UNSW Sydney have broken new ground by proving that ‘spin qubits’ – properties of electrons representing the basic units of information in quantum computers – can retain information for up to two milliseconds . Known as “coherence time,” the amount of time qubits can be manipulated into increasingly complicated calculations, the achievement is 100 times longer than previous benchmarks in the same quantum processor. “Longer coherence time means you have more time for your quantum information to be stored – which is exactly what you need when performing quantum operations,” says PhD student Ms Amanda Seedhouse, whose work in theoretical quantum computing have contributed to the realization. “Coherence time basically tells you how long you can perform all the operations in the algorithm or sequence you want to perform before you lose all of your qubit information.” In quantum computing, the longer you can keep spins moving, the better the chance that information can be retained during calculations. When the spin qubits stop spinning, the computation collapses and the values represented by each qubit are lost. The concept of coherence extension has already been confirmed experimentally by quantum engineers from UNSW in 2016. Making the task even more difficult is the fact that working quantum computers of the future will need to track the values of millions of qubits if they are to solve some of humanity’s greatest challenges, such as finding effective vaccines, modeling systems weather and predict the impacts of climate change. Late last year, the same team at UNSW Sydney solved a technical problem that had stuck engineers for decades on how to manipulate millions of qubits without generating more heat and interference. Rather than adding thousands of tiny antennas to control millions of electrons with magnetic waves, the research team found a way to use a single antenna to control all of the chip’s qubits by introducing a crystal called dielectric resonator. These results were published in Scientists progress. This solved the problem of space, heat, and noise that would inevitably increase as more and more qubits come online to perform the mind-bending calculations that are possible when qubits represent not just 1 or 0 like conventional binary computers, but both at the same time. , using a phenomenon known as quantum superposition. Global control vs individual control However, this proof-of-concept achievement still left some challenges to be resolved. Lead researcher Ms Ingvild Hansen joined Ms Seedhouse in addressing these questions in a series of journal articles Physical examination B, Physical examination A and Applied physics exams – the last article published this week. Being able to control millions of qubits with a single antenna was a big step forward. But while controlling millions of qubits at once is a great feat, working quantum computers will also need them to be manipulated individually. If all spin qubits spin at roughly the same frequency, they will have the same values. How can we control them individually so that they can represent different values in a calculation? “We first showed theoretically that we can improve coherence time by continuously spinning qubits,” says Hansen. “If you imagine a circus performer spinning plates, while they are still spinning, the show can go on. Similarly, if we continuously drive qubits, they can hold information longer. We have shown that such “dressed” qubits have coherence times of more than 230 microseconds [230 millionths of a second].” After the team showed that coherence times could be extended with so-called “dressed” qubits, the next challenge was to make the protocol more robust and show that globally controlled electrons can also be individually controlled so that they may contain different necessary values. for complex calculations. This was achieved by creating what the team dubbed the “SMART” qubit protocol – Sinusoidally Modulated, Always Rotating and Tailored. Rather than spinning the qubits in circles, they manipulated them to rock back and forth like a metronome. Then, if an electric field is individually applied to any qubit – bringing it out of resonance – it can be put into a different tempo from its neighbors, but still moving at the same rate. “Think of it like two kids on a seesaw moving forward and backward pretty much in sync,” says Ms Seedhouse. “If we nudge one of them, we can get them to reach the end of their arc at opposite ends, so one can be a 0 when the other is now a 1. .” The result is that not only can a qubit be controlled individually (electronically) under the influence of a global control (magnetically), but the coherence time is, as mentioned before, significantly longer and suitable for quantum calculations. “We have shown a simple and elegant way to control all qubits at once, which also comes with better performance,” says Dr. Henry Yang, one of the team’s lead researchers. “The SMART protocol will be a potential route for large-scale quantum computers.” The research team is led by Professor Andrew Dzurak, CEO and founder of Diraq, a UNSW spin-off company that develops quantum computing processors that can be fabricated using standard silicon chip fabrication. “Our next goal is to show it works with two-qubit computations after showing our proof of concept in our experimental paper with one qubit,” says Hansen. “After that, we want to show that we can do it for a handful of qubits as well, to show that the theory is proven in practice.” #long #time #Quantum #computing #engineers #setting #standard #silicon #chip #performance
<urn:uuid:802d5c61-f02c-428a-86a5-55e29288b3b0>
CC-MAIN-2023-06
https://pcaraci.com/2022/09/29/cbmigqfodhrwczovl25ld3nyb29tlnvuc3cuzwr1lmf1l25ld3mvc2npzw5jzs10zwnol2xvbmdlc3qtdgltzs1xdwfudhvtlwnvbxb1dgluzy1lbmdpbmvlcnmtc2v0lw5ldy1zdgfuzgfyzc1zawxpy29ulwnoaxatcgvyzm9ybwfuy2xsaqaoc5/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764499934.48/warc/CC-MAIN-20230201112816-20230201142816-00876.warc.gz
en
0.933014
1,244
3.75
4
The current artificial intelligence (AI) systems are regulated by other existing regulations such as data protection, consumer protection and market competition laws. It is critical for governments, leaders, and decision makers to develop a firm understanding of the fundamental differences between artificial intelligence, machine learning, and deep learning. Artificial intelligence (AI) applies to computing systems designed to perform tasks usually reserved for human intelligence using logic, if-then rules, and decision trees. AI recognizes patterns from vast amounts of quality data providing insights, predicting outcomes, and making complex decisions. Machine learning (ML) is a subset of AI that utilises advanced statistical techniques to enable computing systems to improve at tasks with experience over time. Chatbots like Amazon’s Alexa and Apple’s Siri improve every year thanks to constant use by consumers coupled with the machine learning that takes place in the background. Deep learning (DL) is a subset of machine learning that uses advanced algorithms to enable an AI system to train itself to perform tasks by exposing multilayered neural networks to vast amounts of data. It then uses what it learns to recognize new patterns contained in the data. Learning can be human-supervised learning, unsupervised learning, and/or reinforcement learning, like Google used with DeepMind to learn how to beat humans at the game Go. State of Artificial Intelligence in the Pandemic Era Artificial intelligence (AI) is stepping up in more concrete ways in blockchain, education, internet of things, quantum computing, arm race and vaccine development. During the Covid-19 pandemic, we have seen AI become increasingly pivotal to breakthroughs in everything from drug discovery to mission critical infrastructure like electricity grids. AI-first approaches have taken biology by storm with faster simulations of humans’ cellular machinery (proteins and RNA). This has the potential to transform drug discovery and healthcare. Transformers have emerged as a general purpose architecture for machine learning, beating the state of the art in many domains including natural language planning (NLP), computer vision, and even protein structure prediction. AI is now an actual arms race rather than a figurative one. Organizations must learn from the mistakes made with the internet, and prepare for a safer AI. Artificial intelligence deals with the area of developing computing systems which are capable of performing tasks that humans are very good at, for example recognising objects, recognising and making sense of speech, and decision making in a constrained environment. There are 3 stages of artificial intelligence: 1. Artificial Narrow Intelligence (ANI), which has a limited range of capabilities. As an example: AlphaGo, IBM’s Watson, virtual assistants like Siri, disease mapping and prediction tools, self-driving cars, machine learning models like recommendation systems and deep learning translation. 2. Artificial General Intelligence (AGI), which has attributes that are on par with human capabilities. This level hasn’t been achieved yet. 3. Artificial Super Intelligence (ASI), which has skills that surpass humans and can make them obsolete. This level hasn’t been achieved yet. Why Governments Need to Regulate Artificial Intelligence? We need to regulate artificial intelligence for two reasons. First, because governments and companies use AI to make decisions that can have a significant impact on our lives. For example, algorithms that calculate school performance can have a devastating effect. Second, because whenever someone takes a decision that affects us, they have to be accountable to us. Human rights law sets out minimum standards of treatment that everyone can expect. It gives everyone the right to a remedy where those standards are not met, and you suffer harm. Is There An International Artificial Intelligence Law? As of today, there is no international artificial intelligence law nor specific legislation designed to regulate its use. However, progress has been made as bills have been passed to regulate certain specific AI systems and frameworks. Artificial intelligence has changed rapidly over the last few decades. It has made our lives so much easier and saves us valuable time to complete other tasks. AI must be regulated to protect the positive progress of the technology. Legislators across the globe have to this day failed to design laws that specifically regulate the use of artificial intelligence. This allows profit-oriented companies to develop systems that may cause harm to individuals and to the broader society. National and International Artificial Intelligence Regulations National and local governments have started adopting strategies and working on new laws for a number of years, but no legislation has been passed yet. China for example has developed in 2017 a strategy to become the world’s leader in AI in 2030. In the US, the White House issued ten principles for the regulation of AI. They include the promotion of “reliable, robust and trustworthy AI applications”, public participation and scientific integrity. International bodies that give advice to governments, such as the OECD or the World Economic Forum, have developed ethical guidelines. The Council of Europe created a Committee dedicated to help develop a legal framework on AI. The most ambitious proposal yet comes from the EU. On 21 April 2021, the EU Commission put forward a proposal for a new AI Act. Ethical Concerns of Artificial Intelligence Police forces across the EU deploy facial recognition technologies and predictive policing systems. These systems are inevitably biased and thus perpetuate discrimination and inequality. Crime prediction and recidivism risk are a second AI application fraught with legal problems. A ProPublica investigation into an algorithm-based criminal risk assessment tool found the formula more likely to flag black defendants as future criminals, labelling them at twice the rate as white defendants, and white defendants were mislabeled as low-risk more often than black defendants. We need to think about the way we are mass producing decisions and processing people, particularly low income and low-status individuals, through automation and their consequences for society. How to Regulate Artificial Intelligence the Right Way An effective, rights-protecting AI regulation must, at a minimum, contain the following safeguards. First, artificial intelligence regulation must prohibit use cases, which violate fundamental rights, such as biometric mass surveillance or predictive policing systems. The prohibition should not contain exceptions that allow corporations or public authorities to use them “under certain conditions”. Second, there must be clear rules setting out exactly what organizations have to make public about their products and services. Companies must provide a detailed description of the AI system itself. This includes information on the data it uses, the development process, the systems’ purpose and where and by whom it is used. It is also key that individuals exposed to AI are informed about it, for example in the case of hiring algorithms. Systems that can have a significant impact on people’s lives should face extra scrutiny and feature in a publicly accessible database. This would make it easier for researchers and journalists to make sure companies and governments are protecting our freedoms properly. Third, individuals and organisations protecting consumers need to be able to hold governments and corporations responsible when there are problems. Existing rules on accountability must be adapted to recognise that decisions are made by an algorithm and not by the user. This could mean putting the company that developed the algorithm under an obligation to check the data with which algorithms are trained and the decisions algorithms make so they can correct problems. Fourth, new regulations must make sure that there is a regulator that can make companies and the authorities accountable and that they are following the rules properly. This watchdog should be independent and have the resources and powers it needs to do its job. Finally, AI regulation should also contain safeguards to protect the most vulnerable. It should set up a system that allows people who have been harmed by AI systems to make a complaint and get compensation. Workers should have the right to take action against invasive AI systems used by their employer without fear of retaliation. A trustworthy artificial intelligence should respect all applicable laws and regulations, as well as a series of requirements; specific assessment lists aim to help verify the application of each of the key requirements: Human agency and oversight: AI systems should enable equitable societies by supporting human agency and fundamental rights, and not decrease, limit or misguide human autonomy. Robustness and safety: Trustworthy AI requires algorithms to be secure, reliable and robust enough to deal with errors or inconsistencies during all life cycle phases of AI systems. Privacy and data governance: Citizens should have full control over their own data, while data concerning them will not be used to harm or discriminate against them. Transparency: The traceability of AI systems should be ensured. Diversity, non-discrimination and fairness: AI systems should consider the whole range of human abilities, skills and requirements, and ensure accessibility. Societal and environmental well-being: AI systems should be used to enhance positive social change and enhance sustainability and ecological responsibility. Accountability: Mechanisms should be put in place to ensure responsibility and accountability for AI systems and their outcomes.
<urn:uuid:3694556e-a642-4777-b72d-13b6895a3896>
CC-MAIN-2023-06
https://articles.entireweb.com/technology/state-of-ai-and-ethical-issues/
s3://commoncrawl/crawl-data/CC-MAIN-2023-06/segments/1674764495012.84/warc/CC-MAIN-20230127195946-20230127225946-00796.warc.gz
en
0.938329
1,834
3.71875
4
Quantum technologies are the way of the future, but will that future ever arrive? Maybe so. Physicists have cleared a bit more of the path to a plausible quantum future by constructing an elementary network for exchanging and storing quantum information. The network features two all-purpose nodes that can send, receive and store quantum information, linked by a fiber-optic cable that carries it from one node to another on a single photon. The network is only a prototype, but if it can be refined and scaled up, it could form the basis of communication channels for relaying quantum information. A group from the Max Planck Institute of Quantum Optics (M.P.Q.) in Garching, Germany, described the advance in the April 12 issue of Nature. (Scientific American is part of Nature Publishing Group.) Quantum bits, or qubits, are at the heart of quantum information technologies. An ordinary, classical bit in everyday electronics can store one of two values: a 0 or a 1. But thanks to the indeterminacy inherent to quantum mechanics, a qubit can be in a so-called superposition, hovering undecided between 0 and 1, which adds a layer of complexity to the information it carries. Quantum computers would boast capabilities beyond the reach of even the most powerful classical supercomputers, and cryptography protocols based on the exchange of qubits would be more secure than traditional encryption methods. Physicists have used all manner of quantum objects to store qubits—electrons, atomic nuclei, photons and so on. In the new demonstration, the qubit at each node of the network is stored in the internal quantum state of a single rubidium atom trapped in a reflective optical cavity. The atom can then transmit its stored information via an optical fiber by emitting a single photon, whose polarization state carries the mark of its parent atom's quantum state; conversely, the atom can absorb a photon from the fiber and take on the quantum state imprinted on that photon's polarization. Because each node can perform a variety of functions—sending, receiving or storing quantum information—a network based on atoms in optical cavities could be scaled up simply by connecting more all-purpose nodes. "We try to build a system where the network node is universal," says M.P.Q. physicist Stephan Ritter, one of the study's authors. "It's not only capable of sending or receiving—ideally, it would do all of the things you could imagine." The individual pieces of such a system had been demonstrated—atoms sending quantum information on single emitted photons, say—but now the technologies are sufficiently advanced that they can work as an ensemble. "This has now all come together and enabled us to realize this elementary version of a quantum network," Ritter says. Physicists proposed using optical cavities for quantum networks 15 years ago, because they marry the best features of atomic qubits and photonic qubits—namely that atoms stay put, making them an ideal storage medium, whereas photons are speedy, making them an ideal message carrier between stationary nodes. But getting the photons and atoms to communicate with one another has been a challenge. "If you want to use single atoms and single photons, as we do, they hardly interact," Ritter adds. That is where the optical cavity comes in. The mirrors of the cavity reflect a photon past the rubidium atom tens of thousands of times, boosting the chances of an interaction. "During this time, there's enough time to really do this information exchange in a reliable way," Ritter says. "The cavity enhances the coupling between the light field and the atom." The M.P.Q. group put their prototype network through a series of tests—transferring a qubit from a single photon to a single atom and reversing the process to transfer information from an atom onto a photon. Combining those read/write operations, the physicists managed to transmit a qubit from one rubidium atom to another located in a separate laboratory 21 meters away, using a messenger photon as the carrier between nodes. (The actual length of optical fiber connecting the two nodes is 60 meters, because it snakes along an indirect route.) A significant number of the photons get lost along the way, limiting the efficiency of the process. But in principle, optical fibers could connect nodes at greater distances. "We're absolutely not limited to these 21 meters," Ritter says. "This 21 meters is just the distance that we happened to have between the two labs." The researchers also demonstrated that their photonic link can be used to entangle the two distant atoms. Quantum entanglement is a phenomenon by which two particles share correlated properties—in other words, the quantum state of one particle depends on the state of its entangled partner. Manipulating one of the particles, then, affects the other particle's state, even if it is located in another laboratory. Researchers hope that entanglement can be harnessed to circumvent the photon losses that come from passage through optical fibers. In a proposed application called a quantum repeater, a series of nodes, linked by entanglement, would extend the quantum connection down the line without depending on any one photon as the carrier. Ritter acknowledges that the new work is simply a prototype, and one for which numerous improvements are possible. For instance, the transfer of a quantum state between labs succeeded only 0.2 percent of the time, owing to various inefficiencies and technical limitations. "Everything is at the edge of what can be done," he says. "All these characteristics are good enough to do what we've done, but there are clear strategies to pursue to make them even better."
<urn:uuid:82c4b284-d3c0-4145-bd89-d895e8085497>
CC-MAIN-2016-22
http://www.scientificamerican.com/article/universal-quantum-network/
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464049277091.36/warc/CC-MAIN-20160524002117-00034-ip-10-185-217-139.ec2.internal.warc.gz
en
0.942877
1,158
3.796875
4
Condensed matter physics – the branch of physics responsible for discovering and describing most of these phases – has traditionally classified phases by the way their fundamental building blocks – usually atoms – are arranged. The key is something called symmetry. To understand symmetry, imagine flying through liquid water in an impossibly tiny ship: the atoms would swirl randomly around you and every direction – whether up, down, or sideways – would be the same. The technical term for this is "symmetry" – and liquids are highly symmetric. Crystal ice, another phase of water, is less symmetric. If you flew through ice in the same way, you would see the straight rows of crystalline structures passing as regularly as the girders of an unfinished skyscraper. Certain angles would give you different views. Certain paths would be blocked, others wide open. Ice has many symmetries – every "floor" and every "room" would look the same, for instance – but physicists would say that the high symmetry of liquid water is broken. Classifying the phases of matter by describing their symmetries and where and how those symmetries break is known as the Landau paradigm. More than simply a way of arranging the phases of matter into a chart, Landau’s theory is a powerful tool which both guides scientists in discovering new phases of matter and helps them grapple with the behaviours of the known phases. Physicists were so pleased with Landau’s theory that for a long time they believed that all phases of matter could be described by symmetries. That’s why it was such an eye-opening experience when they discovered a handful of phases that Landau couldn’t describe. Beginning in the 1980s, condensed matter researchers, including Xiao-Gang Wen – now a faculty member at Perimeter Institute – investigated new quantum systems where numerous ground states existed with the same symmetry. Wen pointed out that those new states contain a new kind of order: topological order. Topological order is a quantum mechanical phenomenon: it is not related to the symmetry of the ground state, but instead to the global properties of the ground state’s wave function. Therefore, it transcends the Landau paradigm, which is based on classical physics concepts. Topological order is a more general understanding of quantum phases and the transitions between them. In the new framework, the phases of matter were described not by the patterns of symmetry in the ground state, but by the patterns of a decidedly quantum property – entanglement. When two particles are entangled, certain measurements performed on one of them immediately affect the other, no matter how far apart the particles are. The patterns of such quantum effects, unlike the patterns of the atomic positions, could not be described by their symmetries. If you were to describe a city as a topologically ordered state from the cockpit of your impossibly tiny ship, you’d no longer be describing the girders and buildings of the crystals you passed, but rather invisible connections between them – rather like describing a city based on the information flow in its telephone system. This more general description of matter developed by Wen and collaborators was powerful – but there were still a few phases that didn’t fit. Specifically, there were a set of short-range entangled phases that did not break the symmetry, the so-called symmetry-protected topological phases. Examples of symmetry-protected phases include some topological superconductors and topological insulators, which are of widespread immediate interest because they show promise for use in the coming first generation of quantum electronics. In the paper featured in today’s issue of Science, Wen and collaborators reveal a new system which can, at last, successfully classify these symmetry-protected phases. Using modern mathematics – specifically group cohomology theory and group super-cohomology theory – the researchers have constructed and classified the symmetry-protected phases in any number of dimensions and for any symmetries. Their new classification system will provide insight about these quantum phases of matter, which may in turn increase our ability to design states of matter for use in superconductors or quantum computers. This paper is a revealing look at the intricate and fascinating world of quantum entanglement, and an important step toward a modern reclassification of all phases of matter. - Read the paper in Science - The current issue of Nature provides experimental confirmation of the existence of quantum spin liquids, one of the new states of matter that was theoretically predicted by Wen and collaborators - Wen’s essay on the connections between condensed matter physics and cosmology - An introduction to understanding phases of matter based on symmetry About Xiao-Gang Wen Regarded as one of the world’s leading condensed matter theorists, Xiao-Gang Wen holds the BMO Financial Group Isaac Newton Chair at Perimeter Institute for Theoretical Physics. The BMO/Newton Chair was established by a $4 million gift from the BMO Financial Group in 2010 and, in 2011, Wen joined Perimeter from MIT as its inaugural occupant. Read a lay-accessible overview of his research.
<urn:uuid:95a01af2-1b0d-4787-980b-c6eb74fd90d9>
CC-MAIN-2016-22
http://perimeterinstitute.ca/node/86118
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464049277091.36/warc/CC-MAIN-20160524002117-00034-ip-10-185-217-139.ec2.internal.warc.gz
en
0.954172
1,042
3.671875
4
A Nov. 5, 2013 Vienna University of Technology press release (also available on EurekAlert) describes research that may make quantum optical switches possible, With just a single atom, light can be switched between two fibre optic cables at the Vienna University of Technology. Such a switch enables quantum phenomena to be used for information and communication technology. The press release goes on to describe a ‘light in a bottle’ technique which leads, the researchers hope, that they may have discovered how to create a quantum light switch, Professor Arno Rauschenbeutel and his team at the Vienna University of Technology capture light in so-called “bottle resonators”. At the surface of these bulgy glass objects, light runs in circles. If such a resonator is brought into the vicinity of a glass fibre which is carrying light, the two systems couple and light can cross over from the glass fibre into the bottle resonator. “When the circumference of the resonator matches the wavelength of the light, we can make one hundred percent of the light from the glass fibre go into the bottle resonator – and from there it can move on into a second glass fibre”, explains Arno Rauschenbeutel. A Rubidium Atom as a Light Switch This system, consisting of the incoming fibre, the resonator and the outgoing fibre, is extremely sensitive: “When we take a single Rubidium atom and bring it into contact with the resonator, the behaviour of the system can change dramatically”, says Rauschenbeutel. If the light is in resonance with the atom, it is even possible to keep all the light in the original glass fibre, and none of it transfers to the bottle resonator and the outgoing glass fibre. The atom thus acts as a switch which redirects light one or the other fibre. Both Settings at Once: The Quantum Switch In the next step, the scientists plan to make use of the fact that the Rubidium atom can occupy different quantum states, only one of which interacts with the resonator. If the atom occupies the non-interacting quantum state, the light behaves as if the atom was not there. Thus, depending on the quantum state of the atom, light is sent into either of the two glass fibres. This opens up the possibility to exploit some of the most remarkable properties of quantum mechanics: “In quantum physics, objects can occupy different states at the same time”, says Arno Rauschenbeutel. The atom can be prepared in such a way that it occupies both switch states at once. As a consequence, the states “light” and “no light” are simultaneously present in each of the two glass fibre cables. [emphasis mine] For the classical light switch at home, this would be plain impossible, but for a “quantum light switch”, occupying both states at once is not a problem. “It will be exciting to test, whether such superpositions are also possible with stronger light pulses. Somewhere we are bound to encounter a crossover between quantum physics and classical physics”, says Rauschenbeutel. This light switch is a very powerful new tool for quantum information and quantum communication. “We are planning to deterministically create quantum entanglement between light and matter”, says Arno Rauschenbeutel. “For that, we will no longer need any exotic machinery which is only found in laboratories. Instead, we can now do it with conventional glass fibre cables which are available everywhere.” Darrick Chang offers a good introduction (i.e., it’s challenging but you don’t need a physics degree to read it) and some analysis of this work in his Nov. 4, 2013 article for Physics (6, 121 (2013) DOI: 10.1103/Physics.6.121) titled: Viewpoint: A Single-Atom Optical Switch. Quantum scientists over the past two decades have dreamt of realizing powerful new information technologies that exploit the laws of quantum mechanics in their operation. While many approaches are being pursued, a prevailing choice consists of using single atoms and particles of light—single photons—as the fundamental building blocks of these technologies . In this paradigm, one envisions that single atoms naturally act as quantum processors that produce and interface with single photons, while the photons naturally act as wires to carry information between processors. Reporting in Physical Review Letters, researchers at the Vienna University of Technology, Austria, have taken an important step forward in this pursuit, by experimentally demonstrating a microphotonic optical switch that is regulated by just a single atom . This article is open access. For those willing to tackle a more challenging paper, here’s a link to and a citation for the Vienna University of Technology researchers’ paper, Fiber-Optical Switch Controlled by a Single Atom by Danny O’Shea, Christian Junge, Jürgen Volz, and Arno Rauschenbeute. Phys. Rev. Lett. 111, 193601 (2013) [5 pages] This work is behind a paywall. Minutes after publishing: here’s an image that illustrates superpositioning in a quantum switch,
<urn:uuid:0c683d91-ba92-4df4-a48d-92585e7aaa27>
CC-MAIN-2016-22
http://www.frogheart.ca/?tag=superpositions
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464049277091.36/warc/CC-MAIN-20160524002117-00036-ip-10-185-217-139.ec2.internal.warc.gz
en
0.898674
1,095
3.578125
4
Nanoscale cavity strongly links quantum particles Scientists have created a crystal structure that boosts the interaction between tiny bursts of light and individual electrons, an advance that could be a significant step toward establishing quantum networks in the future. Today’s networks use electronic circuits to store information and optical fibers to carry it, and quantum networks may benefit from a similar framework. Such networks would transmit qubits – quantum versions of ordinary bits – from place to place and would offer unbreakable security for the transmitted information. But researchers must first develop ways for qubits that are better at storing information to interact with individual packets of light called photons that are better at transporting it, a task achieved in conventional networks by electro-optic modulators that use electronic signals to modulate properties of light. Now, researchers in the group of Edo Waks, a fellow at JQI and an Associate Professor in the Department of Electrical and Computer Engineering at the University of Maryland, have struck upon an interface between photons and single electrons that makes progress toward such a device. By pinning a photon and an electron together in a small space, the electron can quickly change the quantum properties of the photon and vice versa. The research was reported online Feb. 8 in the journal Nature Nanotechnology. “Our platform has two major advantages over previous work,” says Shuo Sun, a graduate student at JQI and the first author of the paper. “The first is that the electronic qubit is integrated on a chip, which makes the approach very scalable. The second is that the interactions between light and matter are fast. They happen in only a trillionth of a second – 1,000 times faster than previous studies.” CONSTRUCTING AN INTERFACE The new interface utilizes a well-studied structure known as a photonic crystal to guide and trap light. These crystals are built from microscopic assemblies of thin semiconductor layers and a grid of carefully drilled holes. By choosing the size and location of the holes, researchers can control the properties of the light traveling through the crystal, even creating a small cavity where photons can get trapped and bounce around. ”These photonic crystals can concentrate light in an extremely small volume, allowing devices to operate at the fundamental quantum limit where a single photon can make a big difference,” says Waks. The results also rely on previous studies of how small, engineered nanocrystals called quantum dots can manipulate light. These tiny regions behave as artificial atoms and can also trap electrons in a tight space. Prior work from the JQI group showed that quantum dots could alter the properties of many photons and rapidly switch the direction of a beam of light. The new experiment combines the light-trapping of photonic crystals with the electron-trapping of quantum dots. The group used a photonic crystal punctuated by holes just 72 nanometers wide, but left three holes undrilled in one region of the crystal. This created a defect in the regular grid of holes that acted like a cavity, and only those photons with only a certain energy could enter and leave. Inside this cavity, embedded in layers of semiconductors, a quantum dot held one electron. The spin of that electron – a quantum property of the particle that is analogous to the motion of a spinning top – controlled what happened to photons injected into the cavity by a laser. If the spin pointed up, a photon entered the cavity and left it unchanged. But when the spin pointed down, any photon that entered the cavity came out with a reversed polarization – the direction that light’s electric field points. The interaction worked the opposite way, too: A single photon prepared with a certain polarization could flip the electron’s spin. Both processes are examples of quantum switches, which modify the qubits stored by the electron and photon in a controlled way. Such switches will be the coin of the realm for proposed future quantum computers and quantum networks. Those networks could take advantage of the strengths that photons and electrons offer as qubits. In the future, for instance, electrons could be used to store and process quantum information at one location, while photons could shuttle that information between different parts of the network. Such links could enable the distribution of entanglement, the enigmatic connection that groups of distantly separated qubits can share. And that entanglement could enable other tasks, such as performing distributed quantum computations, teleporting qubits over great distances or establishing secret keys that two parties could use to communicate securely. Before that, though, Sun says that the light-matter interface that he and his colleagues have created must create entanglement between the electron and photon qubits, a process that will require more accurate measurements to definitively demonstrate. “The ultimate goal will be integrating photon creation and routing onto the chip itself,” Sun says. “In that manner we might be able to create more complicated quantum devices and quantum circuits.” In addition to Waks and Sun, the paper has two additional co-authors: Glenn Solomon, a JQI fellow, and Hyochul Kim, a post-doctoral researcher in the Department of Electrical and Computer Engineering at the University of Maryland. "Creating a quantum switch" credit: S. Kelley/JQI Subscribe to A Quantum Bit Quantum physics began with revolutionary discoveries in the early twentieth century and continues to be central in today’s physics research. Learn about quantum physics, bit by bit. From definitions to the latest research, this is your portal. Subscribe to receive regular emails from the quantum world. Previous Issues... Sign Up Now Sign up to receive A Quantum Bit in your email!
<urn:uuid:10637b5b-584b-47a8-bf77-11153d3c385a>
CC-MAIN-2016-22
http://jqi.umd.edu/news/nanoscale-cavity-strongly-links-quantum-particles
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464049277091.36/warc/CC-MAIN-20160524002117-00036-ip-10-185-217-139.ec2.internal.warc.gz
en
0.928738
1,161
3.734375
4
Quantum computing is one of the current big things in both physics and computer science circles. But there is a serious divide between what we think might be possible and what we can, in fact, do. There are theorists out there working themselves into a frenzy, trying to show that quantum computing will make a smoother latte. On the experimental side, many researchers are still in various stages of single gate operations. It is like the difference between trying to make a valve and knowing what you can do with lots of valves once you have them. In a recent paper, published in Applied Physics Letters, researchers from the UK and Australia have demonstrated that quantum computing gates with very low error rates, based on integrated optical circuits, are now feasible. This might pave the way for multi-gate optical quantum computers. Quantum computing is, as the name might suggest, a merger between classical digital computers and the quantum freakiness that permeates the world around us at the smallest scales. In a classical computer, a bit can have two values: logic one and logic zero. When we perform operations on a string of bits, we either leave them unchanged or flip them, depending on some control bits. It is important to realize that the value of a bit at any particular time does not depend on any of its partner bits. If we add a dash of quantumness to the mix, we can do two things. First, logic elements, now qubits, are no longer logic one or logic zero; instead, they are both at the same time. When we read out the result from a program, we obtain a definite one or zero, but during the computation, the qubit really is in both states. Operations don't necessarily flip bits. Instead, they modify the probability of a measurement returning a one or a zero. The second element added to the mix is correlations between qubits. When we perform an operation on one qubit in a string of them, we are actually performing an operation on all the qubits. There are good and bad aspects to this. A quantum computer doesn't always return the right answer, but some operations, like factoring or database searches, can be sped up. Not returning the right answer comes from two factors. There is an intrinsic uncertainty associated with measurement—it's the price we pay for being in a quantum universe. There are also instrumental imperfections, which, at the moment, play a major role in limiting quantum computing. This is where Laing and colleagues come in. They focused on the construction of near perfect circuity. In the case of optical quantum computing logic, this corresponds to making perfect beam splitters and interferometers. These aren't the normal optics you might find in a microscope, which makes things both easier and more difficult. For instance, in a waveguide, a beam splitter is replaced by a directional coupler, where two waveguides are brought into close proximity. Over a certain length, light from one waveguide will leak into the adjacent waveguide. The amount of light that transfers depends on how close the two waveguides are and the distance they remain close. So, in principle, it is very easy to design a perfect beam splitter. In practice, fabrication uncertainty makes this a bit of a lottery—the usual procedure is to make quite a few, test them all, and pick the good one to report on. Interferometers are similar, in that they involve splitting and recombining light beams. However, in addition to requiring two perfect beam splitters for the interferometer, one also needs to carefully control how far the light must travel between the two. In other words, the fabrication tolerances on the two different light paths are quite tight. However, once you have these two elements, you can make a controlled NOT gate—a gate that inverts the quantum state of one qubit, depending on the state of the controlling qubit—which is a logic element from which all other logic elements can be constructed. That is exactly what this paper demonstrates. They show that they have very low loss waveguides, and that they can make beam splitters with a splitting ratio within a couple percent of their design ratio. To illustrate this, they showed data obtained from the quantum interference between single photons passing through their beam splitter. The error bars on the data are tiny, so within the uncertainty of their measurements, they have a perfect instrument. Likewise, Laing and colleagues show a controlled NOT gate that gets it right 97 percent of the time. "Right" being a relative thing here—this is the fidelity, which means it takes into account the fact that quantum measurements have a finite chance of getting the wrong answer irrespective of the quality of the equipment. From this, they calculate that, at worst, they have an error rate between one part in 100 and one part in 1000. The latter figure is probably good enough to start thinking about multiple gate operations. As you can see, I'm not reporting on anything startling here, just a good solid bit of technology that is necessary for optical quantum computers to do anything useful. I do wonder, however, how many of the circuit elements on the wafer were functional, because that is probably the limiting factor now. One thing missing in all optical implementations of quantum computers is programmability, because that involves switching light paths around. In integrated optic implementations, like this one, switches could be fast, and if the losses are low enough, programmability might well be on the horizon. The bigger problem on the horizon is multi-qubit calculations. To perform a calculation represented by a register of eight qubits, every one of those qubits has to be entangled with every other qubit, and that ain't easy. Applied Physics Letters, 2010, DOI: 10.1063/1.3497087
<urn:uuid:34028beb-8c0d-4fbc-aaae-f25d860e4c1d>
CC-MAIN-2016-22
http://arstechnica.com/science/2010/12/waveguides-make-quantum-computers-more-reliable/?comments=1&post=21123327
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464051165777.94/warc/CC-MAIN-20160524005245-00240-ip-10-185-217-139.ec2.internal.warc.gz
en
0.953076
1,192
3.796875
4
But it's a little more complex than this. We also have quantum mechanics to contend with. The spin of an electron is a vector. But we find that when we measure one of the components of this vector this value is quantised and can only take values +hbar/2 and -hbar/2, where hbar is Planck's constant. We choose units where h-bar is 1 so the z-component of the spin is always measured to be +1/2 or -1/2. If we write these two states as |+> and |-> then because we are dealing with quantum mechanics, the z-component of the spin can be represented by the linear combination a|+>+b|->. This corresponds to a state in which there is a probability |a|² of measuring +1/2 and a probability |b|² of measuring -1/2. This is what might have been written as a.*return (1/2)+b.*return (-1/2) in my earlier Haskell code. But that's just one component of the spin. What about the x- and y-components? Amazingly the state a|+>+b|-> tells us everything we can possibly know about the spin of an electron and we'll call it a spin state. Suppose we have an electron in the state ψ = a|+>+b|->. What happens if we measure the y-component of its spin? One way to answer that question is to rotate the electron through π/2 so that its x-axis is rotated to align with the z-axis and then measure the z-component of its spin. In order to to that we need to know how to rotate spin states. The rule for rotation through θ about the x-axis is this (in a suitable coordinate frame): |+> → cos(θ/2)|+>-sin(θ/2)|-> |-> → sin(θ/2)|+>+cos(θ/2)|-> Note how choosing θ=0 gives the identity, as expected. Note also that θ=π maps a|+>+b|-> to b|+>-a|-> so that the probabilities of measuring +1/2 and -1/2 are simply swapped, exactly what you'd expect for turning a state upside down. But there's something else that you should notice - there's an ambiguity. A rotation through 2π should give the same as a rotation through 0 and yet setting θ=2π in that transformation maps a state ψ to -ψ. Now |a|² = |-a|² so the probability of observing spin up or spin down is unaffected. But as I've been showing over previous posts, flipping a sign in a state can make a big difference as soon as you start performing interference experiments. The same goes for any angle: if I rotate through π should I use θ=π or θ = 3π? So can the transformation I've given make sense? The transformation does make sense if you consider that in any physical process that rotates an electron the transformation will evolve continuously over time. Electrons don't just instantly rotate. In other words, if a rotation is applied to an electron then it will follow a path in SO(3), not just be an instantaneous application of an element of SO(3). And that allows us to resolve the ambiguity: the rotations of electrons are described by the double cover of SO(3) known as SU(2). So a rotation through 360 degrees doesn't return you to the identity although a 720 degree rotation does. The transformation I gave above is completely unambiguous if you continuously rotate an electron around the x-axis tracking a continuous value of θ, after all, the double cover is basically just the set of continuous paths from the identitiy in SO(3) (with homotopic paths considered equivalent). And that's the bizarre fact: electron rotations aren't described by SO(3), they're described by SU(2). In particular, rotating an electron through 360 degrees does not return it to its original state, but a rotation through 720 degrees does! In a sense, like Dirac's belt, electrons can remember something about the path they took to get where they are, in particular they remember how many twists there were in the path. What does this mean experimentally? the first thing to note is that this is true not just for electrons but any spin-1/2 fermion. This included protons and neutrons. The stuff I've been talking about manifests itself in a number of ways. In particular, the spin of a particle affects how a magnetic field acts on it. For example, spin-up and spin-down particles can be separated into distinct beams using Stern-Gerlach apparatus. Also, the spin of particles precesses in a magnetic field and this is used on a regular basis in NMR. These two facts allow us to easily manipulate and measure the spin of fermions. In other words, the fact that fermions remember how many twists there are in their rotations isn't just some esoteric nonsense, it's now engineering and the theory is tested repeatedly all over the world. Every familiar object is invariant under rotations through 360 degrees. So the fact that electrons need to be rotated through 720 degrees to return them to their original state seems like one of the most bizarre facts about the universe I know of. And yet many books that introduce spin just slip in this fact in a routine way as if it were no different to any other. The fact that the biggest connected cover of SO(3) is the double cover puts a big constraint on the kinds of weird effects like this can happen. We can have a 360 degree rotation multiply by -1, but not by i, because a 720 degree rotation absolutely has to return us to where we started from. But suppose the universe were 2-dimensional. If you remember what I said about SO(2) you may notice that no such constraints apply because SO(2) has an infinite cover. There is a group in which all of the rotations through 360n degrees are distinct for distinct n. This means that a physical system could have its state multiplied by any factor (of modulus 1) when rotated through 360 degrees. Particle that behave this way are called anyons. But we live in a 3D universe so we don't expect any fundamental particles to have this property. However, in quantum mechanics any kind of 'excitation' of a physical system is quantised and can be thought of as a type of particle. These are known as quasiparticles. For example, just as light is made of photons, sound is also quantised as phonons. In the right kind of solid state medium, especially those that arise from some kind of 2D lattice, it seems quite plausible that anyons might arise. This gives rise to the so called fractional quantum hall effect. Anyons might one day play an important role in quantum computing via topological quantum computation.
<urn:uuid:2e9391e8-b1b4-4581-b1a7-778b9a5b3d4e>
CC-MAIN-2016-22
http://blog.sigfpe.com/2007/04/curious-rotational-memory-of-electron.html?showComment=1176606420000
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464049276543.81/warc/CC-MAIN-20160524002116-00000-ip-10-185-217-139.ec2.internal.warc.gz
en
0.955202
1,475
4.09375
4
Got mass? Princeton scientists observe electrons become both heavy and speedy Posted June 13, 2012; 02:00 p.m. A Princeton University-led team of scientists has shown how electrons moving in certain solids can behave as though they are a thousand times more massive than free electrons, yet at the same time act as speedy superconductors. The observation of these seemingly contradictory electron properties is critical to the understanding of how certain materials become superconducting, in which electrons can flow without resistance. Such materials could dramatically increase the efficiency of electrical power networks and speed up computers. This video displays heavy electrons at different energies and shows their standing wave patterns (like water in a pond) around individual atomic defects placed intentionally in a compound. The patterns in these images allowed the Princeton scientists to understand the formation of heavy electron waves and to identify a hard-to-measure quantum entanglement process that controls their mass. (Video by the Yazdani Group) The concept of "heavy" electrons seems counterintuitive. The tiny particles flit through silicon chips to process information rapidly in digital electronics, and they flow with ease through copper wires carrying electricity to your desk lamp. But the Princeton research has revealed that a hard-to-measure process known as quantum entanglement determines the mass of electrons moving in a crystal and the delicate tuning of this entanglement can strongly alter the properties of a material. Cool the electrons to far below room temperature in certain types of solid materials, and these flighty particles gain mass, acting like much heavier particles. Surprisingly, further cooling close to absolute zero makes these solids become superconducting, where the electrons, despite their heaviness, make a kind of perfect fluid that can flow without wasting any electrical power. Electrons moving in certain solids can behave as if they are a thousand times more massive than free electrons, but at the same time act as superconductors. A new study led by Princeton scientists shows that this happens because of a process known as quantum entanglement that determines the mass of electrons moving in a crystal. The discovery can help improve understanding of how certain materials become superconducting, which may have applications in areas such as power network efficiency and computing speed. (Image by the Yazdani Group) In a study to appear in the June 14 issue of the journal Nature, the Princeton-led team, which included scientists from Los Alamos National Laboratory (LANL) and the University of California-Irvine, used direct imaging of electron waves in a crystal. The researchers did so not only to watch the electrons gain mass but also to show that the heavy electrons are actually composite objects made of two entangled forms of the electron. This entanglement arises from the rules of quantum mechanics, which govern how very small particles behave and allow entangled particles to behave differently than untangled ones. Combining experiments and theoretical modeling, the study is the first to show how the heavy electrons emerge from such entanglement. Observations made over the last 30 years indicate that electrons in certain solids behave as particles with masses hundreds to thousands of times larger than that of electrons moving freely in a vacuum. Until now, however, researchers had been unable to understand how this happens and lacked the tools to explore the connection between this process and the superconductivity of heavy electrons. The published study comes after several years of setting up the precise experimental conditions needed to visualize these heavy electrons. The team employed a custom-designed cryogenic scanning tunneling microscope (STM), which allows visualization of electron waves in a crystal. The researchers used the STM to look at crystals prepared in such a way that their surfaces contained a few atomic imperfections. As they lowered the temperature in the experiment, the researchers saw the emergence of patterns of electron waves spread around the defects in a way similar to how ripples of water form around rocks in a pond. (See video.) "It is remarkable to watch electrons moving in a crystal evolve into more massive particles as we cool them down," said Ali Yazdani, a professor of physics at Princeton and head of the team that conducted the study. Making this groundbreaking observation of electrons as they transition from light to heavy particles is only part of the story. The researchers also showed how the process can be understood based on quantum theories of electron behavior. Subatomic particles such as electrons can exhibit strange behavior because of quantum entanglement, which can mix diametrically opposite behaviors together. By comparing the data with theoretical calculations, the study shows that heavy electrons emerge from entanglement of two opposite behaviors of electrons, one in which they are localized around individual atoms and the other in which they are hopping freely from atom to atom in the crystal. "This is the first time we have a precise picture of formation of heavy electrons, thanks to our ability to probe them with high resolution," Yazdani said. The degree of such entanglement appears to be the key to understanding what the heavy electrons do once they are formed and cooled even further. Adjusting the crystal composition or structure can tune the degree of entanglement and the heaviness of electrons. Make the electrons too heavy and they freeze into a magnetized state, stuck at each atom in the crystal while spinning in unison. But tweaking the crystal composition so that the electrons have just the right amount of entanglement turns these heavy electrons into superconductors when they are cooled. "What is neat, and our studies confirm this, is that you really need to be on the verge of these two kinds of behaviors — sluggish and speedy — to get superconductivity," Yazdani said. "That is the circumstance most favorable to occurrence of heavy electron superconductivity." Understanding superconducting behavior of exotic electrons is at the forefront of research in physics, where there are many examples of magnetic materials that turn superconducting with subtle changes in their composition or crystal structure. The experiments may help physicists unravel the mysteries of high-temperature superconductivity, said Subir Sachdev, a theoretical physicist at Harvard University who was not involved with the work. Many physicists have argued that understanding this transition between magnetism and superconductivity, known as a quantum critical point, could help explain why the materials are superconducting. But physicists have lacked experimental evidence to prove their ideas. "We have been waiting for observations like this for many years, so it is very exciting that such a beautiful experimental system has been found and characterized so well," Sachdev said. The research was primarily supported by the U.S. Department of Energy's Basic Energy Sciences program. Additional support came from the National Science Foundation's Materials Research Science and Engineering Center program through the Princeton Center for Complex Materials; the W.M. Keck Foundation; and the Eric and Wendy Schmidt Transformative Technology Fund at Princeton. In addition to Yazdani, Princeton scientists on the team included postdoctoral scientist Pegor Aynajian and graduate students Eduardo da Silva Neto and András Gyenis. The team also included Ryan Baumbach, Joseph Thompson and Eric Bauer from LANL and Zachary Fisk from UC Irvine.
<urn:uuid:51e52734-4609-45fe-b971-7b0c8a3d2acb>
CC-MAIN-2016-22
http://www.princeton.edu/main/news/archive/S33/94/41S36/index.xml?section=topstories&path=/main/news/archive/S33/94/41S36/index.xml&prev=1
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464049276780.5/warc/CC-MAIN-20160524002116-00022-ip-10-185-217-139.ec2.internal.warc.gz
en
0.938719
1,456
3.53125
4
Bose-Einstein condensation in the solid state New experimental research shows that half-matter, half-light quasi-particles called polaritons show compelling evidence of Bose-Einstein condensation at the relatively high temperature of 19 degrees Kelvin. The creation of a polariton Bose-Einstein condensate in the solid state provides scientists with a unique opportunity to better understand and possibly exploit the quantum effects that occur in these very special conditions. Researchers at EPFL (Ecole Polytechnique Federale de Lausanne), collaborating with colleagues at University of Grenoble, Cambridge, Oxford and MIT, have reported the observation of polaritons displaying the defining features of Bose Einstein condensation --a macroscopically ordered state, long-range spatial coherence and polarization – for the first time in solid state. Their results appear in an article in the September 28 issue of the journal Nature. Bose-Einstein condensates are sometimes referred to as a "fifth state of matter", a special phase in which all the particles share the same quantum state. This phase was predicted by Satyendranath Bose and Albert Einstein in 1924. Getting atoms cold enough to provide experimental proof of its existence took seventy more years, and the first successful experiments using Rubidium atoms won Eric Cornell, Wolfgang Ketterle and Carl Wieman the 2001 Nobel prize in physics. Cooled to within a hair of absolute zero, the atoms in dilute clouds of bosonic gases stop moving and condense, not into a liquid, but into a new phase called a condensate, in which the atoms all share the same quantum state. Like photons in a laser, the particles are coherent, behaving en masse like a "super-particle." The possibility of a phase change into a Bose-Einstein-like condensate theoretically applies for all bosonic particles, including electron-hole pairs called excitons and half exciton, half photon quasi-particles called polaritons. Exploring Bose-Einstein condensation and its intriguing quantum effects using these quasi-particles is particularly interesting because their light mass makes things much easier. A polariton is a billion times lighter than a Rubidium atom, and 10,000 times lighter than an electron. This means that polaritons can transform into a Bose-Einstein condensate at a much higher temperature than alkali gases. Some of the possibilities that have been suggested for applications of the quantum effects of the Bose-Einstein phase -- quantum computing, quantum clocks or atomic or lasers that use matter instead of light – are only realistically conceivable if these condensates can be achieved at room temperature, or at least temperatures that can be reached using standard cryogenic techniques. Signatures of exciton and polariton coherence have been previously observed in semiconductor microcavities, but conclusive proof, such as evidence of polarization and long range particle coherence, has remained elusive because the particles only live a trillionth of a second. The experiments of the EPFL-led team provide the first convincing evidence of a Bose-Einstein like condensate in the solid state. The researchers confined photons in a semiconductor microcavity containing a large number of quantum wells, and then used a laser to excite the semiconductor, generating polaritons. At a critical density, at the easily attainable temperature of 19 degrees Kelvin (about -254 Celsius), the polaritons showed evidence of spontaneous coalescence into a single coherent ground state. The classic phase transition characteristics -- macroscopic polarization and spatial coherence across the entire condensate -- are clearly seen here, and for the first time in solid state. According to Professor Benoit Deveaud, leader of the research team, condensates at even higher temperatures could perhaps be achieved using other semiconductor materials. "The magical properties of superfluidity, where matter flows with zero friction, and superconductivity, where a current flows with zero resistance, are quantum effects, and in the Bose-Einstein condensate they are directly brought to our perception," notes Deveaud. "It is exciting to envision exploring this magic without having to use an incredibly complex machine to get to temperatures just above absolute zero." What practical applications will this lead to? "We are still exploring the basic physics of this phenomenon," says Deveaud. "But just achieving this phase in the solid state is exciting. In the mid 1900s, transistors replaced vacuum lamps, and now most useful devices are made in the solid state," he explains. "Polaritons, although made with a photon, are really quasi-particles in the solid. It is likely that they can be manipulated much as electrons are – an advance that has led to incredible new technologies such as the CCD chips in digital cameras." Last reviewed: By John M. Grohol, Psy.D. on 30 Apr 2016 Published on PsychCentral.com. All rights reserved.
<urn:uuid:7800ca87-58c3-4230-b346-7a355e86d7df>
CC-MAIN-2016-22
http://psychcentral.com/news/archives/2006-09/epfd-bci092506.html
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464051177779.21/warc/CC-MAIN-20160524005257-00000-ip-10-185-217-139.ec2.internal.warc.gz
en
0.907437
1,028
3.5625
4
Another two mind-bending, paradigm-shattering findings in the new physics are known as “Non-Locality” and “Quantum Entanglement.” In classical physics, objects were seen as localized and isolated from one another within space; through dozens of replicated and verified experiments we now know, however, that the universe at the quantum level is entangled, non-local, One integrated whole. “Quantum physicists discovered a strange property in the subatomic world called ‘nonlocality’. This refers to the ability of a quantum entity such as an individual electron to influence another quantum particle instantaneously over any distance despite there being no exchange of force or energy. It suggests that quantum particles once in contact retain a connection even when separated, so that the actions of one will always influence the other, no matter how far they get separated.” -Lynne McTaggart, “The Field: The Quest for the Secret Force of the Universe,” (11) Before the advent of quantum physics, Albert Einstein, still thinking in the classical paradigm, thought that nothing in the universe could travel faster than light. In the past two decades, however, it has been experimentally proven that one thing can indeed move faster than the speed of light: information. Information can be sent between two objects at any distance instantaneously. “In 1997, scientific journals throughout the world published the results of something that traditional physicists say shouldn’t have happened. Reported to over 3,400 journalists, educators, scientists, and engineers in more than 40 countries, an experiment had been performed by the University of Geneva in Switzerland on the stuff that our world is made of – particles of light called photons – with results that continue to shake the foundation of conventional wisdom.” -Gregg Braden, “The Divine Matrix” (30) This ground-breaking experiment conclusively proved the existence of “Quantum Entanglement” which is basically a fancy name for “instantaneous information travel.” First scientists took single photons and split them into separate “twin” particles with identical properties. Then they fired both particles away from each other in opposite directions through specially designed fiber-optic chambers. At the end of these long pathways, the twin particles were forced to choose between two random but exactly identical routes. Curiously, without fail, in every trial the particles made precisely the same choices and traveled the same paths. Classical physics has always assumed that separate particles have no communication with one another, but quantum physics has now proven that assumption erroneous. The first entanglement experiments were designed and tested in 1982 by French physicist Alain Aspect at Orsay’s Institut d’Optique. These crude but conclusive studies later inspired Nicholas Gisin’s University of Geneva group of physicists to replicate them at greater distances. In 1997 Gisin built a 14 mile fiber-optic chamber and repeated Aspect’s experiment with exactly the same results. Later in 2004 Gisin extended the chamber to 25 miles and once again, as usual, no matter how far apart, the particles always chose and traveled the same random pathways. “Quantum mechanics has shown through experimentation that particles, being after all but moving points on some infinite wave, are in communication with one another at all times. That is to say, if our quantum mechanic does something to particle A over in Cincinnati, Ohio, planet Earth, the experience of this event will be instantly communicated to particle Z, at speeds faster than light, over in Zeta Reticuli. What this suggests is that anything one given particle experiences can be experienced by another particle simultaneously, and perhaps even by all particles everywhere. The reason for this is that they are all part of the same wave, the same energy flow.” –Jake Horsley, “Matrix Warrior” (90-91) “For a message to travel between them, it would have to be moving faster than the speed of light. But according the Einstein’s theory of relativity, nothing can travel that quickly. So is it possible that these particles are violating the laws of physics … or are they demonstrating something else to us? Could they be showing us something so foreign to the way we think about our world that we’re still trying to force the mystery of what we see into the comfortable familiarity of how we believe energy gets from one place to another? What if the signal from one photon never traveled to reach the other? Is it possible that we live in a universe where the information between photons, the prayer for our loved ones, or the desire for peace in a place halfway around the world never needs to be transported anywhere to be received? The answer is yes! This appears to be precisely the kind of universe we live in.” -Gregg Braden, “The Divine Matrix” (105-6) In it they state, “All particles in the history of the cosmos have interacted with other particles in the manner revealed by the Aspect experiments … Also consider … that quantum entanglement grows exponentially with the number of particles involved in the original quantum state and that there is no theoretical limit on the number of these entangled particles. If this is the case, the universe on a very basic level could be a vast web of particles, which remain in contact with one another over any distance in ‘no time’ in the absence of the transfer of energy or information. This suggests, however strange or bizarre it might seem, that all of physical reality is a single quantum system that responds together to further interactions.” The fact is quanta can exchange information over any distance in the universe instantaneously. These entanglement experiments prove that Eintstein was incorrect in stating that nothing travels faster than light (186,000 miles per second). Quantum information “travels” at infinite speed “arriving” at its destination without any time elapsing. Here we see how the Newtonian/Einsteinian language of a local universe fails to describe our actual reality. It’s not that information is “traveling” at infinite “speed” to “arrive” at another location, but rather that the universe with all its so-called parts and particles is actually One non-local quantum system. Information from one particle to another doesn’t need to “travel” there because the space between them is illusory, as is the language of calling them “separate” particles. As we have seen, before observation quanta are not particles with definite attributes and location; they are merely waves in the One universal quantum ocean until our conscious observation individualizes the wave into droplets of experience. “Nonlocality shatters the very foundations of physics. Matter can no longer be considered separate. Actions do not have to have an observable cause over an observable space. Einstein’s most fundamental axiom isn’t correct: at a certain level of matter, things can travel faster than the speed of light. Subatomic particles have no meaning in isolation but can only be understood in their relationships. The world, at its most basic, exists as a complex web of interdependent relationships, forever indivisible.” -Lynne McTaggart, “The Field: The Quest for the Secret Force of the Universe,” (11) “As an aside, it’s interesting to note that Nadeau and Kafatos mention early in their book that readers accidentally encountering their book in the ‘new age’ section of a bookstore would likely be disappointed. That’s because the book is about physics and not new age ideas. But the fact that Nadeau and Kafatos felt it important to mention this at all illustrates the rising tension between the leading edge of interpretations in physics and the tail end of metaphysics. Physicists interested in quantum ontology are painfully aware that some interpretations of quantum reality are uncomfortably close to mystical concepts. In the eyes of mainstream science, to express sympathy for mysticism destroys one’s credibility as a scientist. Thus the taboo persists.” -Dean Radin, “Entangled Minds” (262) Download the Spiritual Science 284-page E-book
<urn:uuid:3b672c21-24c5-4a94-9b76-d27b478d8318>
CC-MAIN-2016-22
http://www.atlanteanconspiracy.com/2012/10/nonlocality-and-quantum-entanglement.html
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464049274985.2/warc/CC-MAIN-20160524002114-00150-ip-10-185-217-139.ec2.internal.warc.gz
en
0.938371
1,719
3.59375
4
Optogenetics: Helping Blind Mice See The Light Last week, researchers formally announced in Molecular Therapy that they had at last found a way to make blind mice see — a true glimmer of hope for the 15+ million worldwide who lose their site to genetic or age-related macular degeneration and retinitis pigmentation. The work comes from a collaboration of labs out of California, Florida and MIT — and it all starts with, yes, algae. Now here’s how it works: 1) Certain types of algae possess proteins called channelrhodopsins that respond to light by firing up activity in the cells that host them. The genes for these proteins can be pulled out, cleaned up and inserted into completely different types of cells like neurons and retinas where they’ll act in exactly the same way: light goes on, cell goes on. 2) In this study, the researchers isolated a gene for channelrhodopsin-2 (ChR2) and piggybacked it — by way of viral vehicle — into the degenerated retinas of mice bred for adult blindness. 3) Once in, the genes slipped into the remaining layer of retinal cells and transformed them into working, light-sensitive substitutes for photoreceptors, which is the type of cell typically lost in adult onset blindness. 10 weeks later, treated mice were successfully swimming through illuminated water mazes almost as well as their naturally sighted cousins and far, far better than their untreated, blind counterparts. To be sure, it’s doubtful they’re seeing 20/20 color vision — a substitute photoreceptor still isn’t the real thing — and in fact, this early on, researchers can’t know exactly how well the mice see, only that they do. But the fact remains: they can see. This alone is mind-boggling, but in fact, it’s only the latest breakthrough in the field of optogenetics, a study in which cells and neurons can be quite literally flipped on or off with a flash of light thanks to the embedded genes within. The field was co-invented in 2004 by the MIT Media Lab’s Ed Boyden, then a Ph.D. candidate at Stanford University, in collaboration with Georg Nagel at the University of Wurzburg and Karl Deisseroth, then also of Stanford. Since then, it has leapt from a single lab bench to over 1,000 research groups across the world; potential applications extend far beyond blindness to encompass Parkinson’s, PTSD, addiction, mood disorders, and neuron-by-neuron mapping of the entire brain from the inside out. Late last year, Nature Methods awarded it the Method of the Year. These days, though, blindness research comprises only a small portion of Boyden’s projects. A methods man at heart, his primary focus is on perfecting his technology and finding better ways to understand exactly how the brain itself actually works. On the eve of this latest report, I stopped by the Media Lab to speak with this man who, late one August night in 2004, all but revolutionized the field of neuroscience. Here’s what he had to say: So this field has absolutely exploded since you kicked it off in 2005. What’s next on the horizon? Well, so there are three main things that are important right now. One, of course, is to make more powerful tools — though eventually, those will get as far as they can. Another one of the big things that we’re still working on is mining genomes throughout the tree of life to find new genes that are higher performance, faster, with better light sensitivity, with higher magnitudes of currents, respond to different colors, and so on. For example, last year we had the first paper to report multi-color silencing [Editor’s Note: By implanting a different gene in a cell, yellow light will cause it to turn off]. There are definitely still new things to come up with, but that said, we’re also always looking for more technologies to come up with as well. In my lab, only about a third of the group works on the molecular perturbation sort of stuff. And what are the other two thirds working on? Well, one of the big issues we’re working on is: how do we confront the complexity of the entire brain? So we’ve started to devise structures that allow us to perturb and record from sites throughout the brain. One of ideas we like to use to frame this whole endeavor is what we like to call brain coprocessors, basically using very fine probes to record data from throughout the brain, mine that data for information on the computations that are occurring in the brain, that can then be used to test theories of the brain. We also have an army grant to collaborate with Ki Goosens [at the MIT McGovern Institute], for which we’re going to try to figure out whether there are any sites in the brain where you can erase PTSD. So you started out an electrical engineer and a physicist working on quantum computing. Now you’re in the middle of the brain and the co-founder of an entire field of neuroscience. What happened? All through my undergrad work [at MIT] and when I started grad school [at Stanford], I was doing this quantum computer, and I really had two themes. One is, how do you control complex systems, and the second is how do we get at the essence of computation. For example, I wrote a control system for an autonomous submarine so that it could navigate underwater – actually, we won the Navy’s first international autonomous underwater vehicle competition with that – then I also wrote an animation for video games based on the laws of physics. So I’m very obsessed with controlling things, because that’s really what gives you a deep understanding of how things work — and it allows you to make stuff: you can make this submarine move underwater, or make this animation move realistically, and so on. So I was really into control theory and controlling physical systems, and it all came to a head around the fall of 1998 when Motorola gave my undergraduate and masters lab $5 million. My then-PI said, okay, I’ll pay for anybody to go wherever you want for a month to learn something new. I went to Bell Labs, which at the time, was the place to go, and it was fantastic. I was only there for a matter of weeks literally, but I came out with three novel things and it was just like, wow. In contrast to physics where we often just felt like we were checking Einstein for the 800th time and he was still correct for the most part. From there, I went to Stanford to study in Dick Tsein’s group – he was also an electrical engineer who switched into biology, and it was in his lab that I and Karl Deisseroth, the co-inventor who was also a student there, started doing the very first studies. But why even head into the brain in the first place? I’m interested in the brain for two reasons, one is a philosophical question: How do we think? The second is pragmatic: The disease burden of the brain is huge, yet a lot of people have given up on figuring out how to treat them. The Wall Street Journal had an article a few weeks ago pointing out how the pharmaceutical companies — GlaxoSmithKline, AstraZeneca, and so on — have more or less given up on the vast majority of brain disorders, and that’s kind of worrisome. I mean, something like a billion people worldwide have some kind of brain disorder, and if you look at the disorder, most of them have very little treatment at all and for the ones that do have treatment, it’s not a cure if it usually has side effects. So how I think of it is: If the pharma industry is giving up on these, then that means that we have a duty to go after them and start working on them. What was the transition like, from tidy engineering to messy biology? I spent a full year just sort of getting used to that. There was a lot of floundering around. Actually, I just came up with this analogy that I think finally captures it: I was listening to This American Life the other day, and poker was the theme for that week. In the opening spiel, they talked about this poker player who won a lot of money by breaking all the rules accidentally and at the end of it, he notes that the thing he hates about poker is that you can play all of your cards optimally, and you still might lose because of chance. I feel like neurotechnology is the same way. In some ways, it’s the highest form of gambling because you can have just an amazing technology, and then some weird thing about the brain will come back and bite you and it won’t work. There is a lot to wrestle with at this level.
<urn:uuid:4a391745-2628-4476-97ce-df2b59c75b45>
CC-MAIN-2016-22
http://www.bostonmagazine.com/news/blog/2011/04/26/optogenetics-or-making-blind-mice-see-the-light/
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464049276780.5/warc/CC-MAIN-20160524002116-00030-ip-10-185-217-139.ec2.internal.warc.gz
en
0.967302
1,891
3.59375
4
When scientists develop a full quantum computer, the world of computing will undergo a revolution of sophistication, speed and energy efficiency that will make even our beefiest conventional machines seem like Stone Age clunkers by comparison. But, before that happens, quantum physicists like the ones in UC Santa Barbara’s physics professor John Martinis’ lab will have to create circuitry that takes advantage of the marvelous computing prowess promised by the quantum bit (“qubit”), while compensating for its high vulnerability to environmentally-induced error. In what they are calling a major milestone, the researchers in the Martinis Lab have developed quantum circuitry that self-checks for errors and suppresses them, preserving the qubits’ state(s) and imbuing the system with the highly sought-after reliability that will prove foundational for the building of large-scale superconducting quantum computers. It turns out keeping qubits error-free, or stable enough to reproduce the same result time and time again, is one of the major hurdles scientists on the forefront of quantum computing face. “One of the biggest challenges in quantum computing is that qubits are inherently faulty,” said Julian Kelly, graduate student researcher and co-lead author of a research paper that was published in the journal Nature. “So if you store some information in them, they’ll forget it.” Unlike classical computing, in which the computer bits exist on one of two binary (“yes/no”, or “true/false”) positions, qubits can exist at any and all positions simultaneously, in various dimensions. It is this property, called “superpositioning,” that gives quantum computers their phenomenal computational power, but it is also this characteristic which makes qubits prone to “flipping,” especially when in unstable environments, and thus difficult to work with. “It’s hard to process information if it disappears,” said Kelly. However, that obstacle may just have been cleared by Kelly, postdoctoral researcher Rami Barends, staff scientist Austin Fowler and others in the Martinis Group. The error detection process involves creating a scheme in which several qubits work together to preserve the information, said Kelly. To do this, information is stored across several qubits. “And the idea is that we build this system of nine qubits, which can then look for errors,” he said. Qubits in the grid are responsible for safeguarding the information contained in their neighbors, he explained, in a repetitive error detection and correction system that can protect the appropriate information and store it longer than any individual qubit can. “This is the first time a quantum device has been built that is capable of correcting its own errors,” said Fowler. For the kind of complex calculations the researchers envision for an actual quantum computer, something up to a hundred million qubits would be needed, but before that a robust self-check and error prevention system is necessary. Key to this quantum error detection and correction system is a scheme developed by Fowler, called the surface code. It uses parity information — the measurement of change from the original data (if any) — as opposed to the duplication of the original information that is part of the process of error detection in classical computing. That way, the actual original information that is being preserved in the qubits remains unobserved. Why? Because quantum physics. “You can’t measure a quantum state, and expect it to still be quantum,” explained Barends. The very act of measurement locks the qubit into a single state and it then loses its superpositioning power, he said. Therefore, in something akin to a Sudoku puzzle, the parity values of data qubits in a qubit array are taken by adjacent measurement qubits, which essentially assess the information in the data qubits by measuring around them. “So you pull out just enough information to detect errors, but not enough to peek under the hood and destroy the quantum-ness,” said Kelly. This development represents a meeting of the best in the science behind the physical and the theoretical in quantum computing — the latest in qubit stabilization and advances in the algorithms behind the logic of quantum computing. “It’s a major milestone,” said Barends. “Because it means that the ideas people have had for decades are actually doable in a real system.” The Martinis Group continues to refine its research to develop this important new tool. This particular quantum error correction has been proved to protect against the “bit-flip” error, however the researchers have their eye on correcting the complimentary error called a “phase-flip,” as well as running the error correction cycles for longer periods to see what behaviors might emerge. Martinis and the senior members of his research group have, since this research was performed, entered into a partnership with Google.
<urn:uuid:608a8482-6c7d-4241-88ae-e73642c47b7c>
CC-MAIN-2016-22
http://www.news.ucsb.edu/2015/015060/strength-numbers
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464054915149.6/warc/CC-MAIN-20160524015515-00067-ip-10-185-217-139.ec2.internal.warc.gz
en
0.933935
1,033
3.703125
4
Less Wrong is a community blog devoted to refining the art of human rationality. Please visit our About page for more information. Even babies think that objects have individual identities. If you show an infant a ball rolling behind a screen, and then a moment later, two balls roll out, the infant looks longer at the expectation-violating event. Long before we're old enough to talk, we have a parietal cortex that does spatial modeling: that models individual animals running or rocks flying through 3D space. And this is just not the way the universe works. The difference is experimentally knowable, and known. Grasping this fact, being able to see it at a glance, is one of the fundamental bridges to cross in understanding quantum mechanics. If you shouldn't start off by talking to your students about wave/particle duality, where should a quantum explanation start? I would suggest taking, as your first goal in teaching, explaining how quantum physics implies that a simple experimental test can show that two electrons are entirely indistinguishable —not just indistinguishable according to known measurements of mass and electrical charge. To grasp on a gut level how this is possible, it is necessary to move from thinking in billiard balls to thinking in configuration spaces; and then you have truly entered into the true and quantum realm. If the probability distribution over this 2D configuration space of two classical 1D particles, looks like a rectangular plaid pattern, then it will factorize into a distribution over A times a distribution over B. In classical physics, the particles A and B are the fundamental things, and the configuration space is just an isomorphic way of looking at them. In quantum physics, the configuration space is the fundamental thing, and you get the appearance of an individual particle when the amplitude distribution factorizes enough to let you look at a subspace of the configuration space, and see a factor of the amplitude distribution—a factor that might look something like this: This isn't an amplitude distribution, mind you. It's a factor in an amplitude distribution, which you'd have to multiply by the subspace for all the other particles in the universe, to approximate the physically real amplitude distribution. Most mathematically possible amplitude distributions won't factor this way. Quantum entanglement is not some extra, special, additional bond between two particles. "Quantum entanglement" is the general case. The special and unusual case is quantum independence. Reluctant tourists in a quantum universe talk about the bizarre phenomenon of quantum entanglement. Natives of a quantum universe talk about the special case of quantum independence. Try to think like a native, because you are one. I've previously described a configuration as a mathematical object whose identity is "A photon here, a photon there; an electron here, an electron there." But this is not quite correct. Whenever you see a real-world electron, caught in a little electron trap or whatever, you are looking at a blob of amplitude, not a point mass. In fact, what you're looking at is a blob of amplitude-factor in a subspace of a global distribution that happens to factorize. Clearly, then, an individual point in the configuration space does not have an identity of "blob of amplitude-factor here, blob of amplitude-factor there"; so it doesn't make sense to say that a configuration has the identity "A photon here, a photon there." But what is an individual point in the configuration space, then? Well, it's physics, and physics is math, and you've got to come to terms with thinking in pure mathematical objects. A single point in quantum configuration space, is the product of multiple point positions per quantum field; multiple point positions in the electron field, in the photon field, in the quark field, etc. When you actually see an electron trapped in a little electron trap, what's really going on, is that the cloud of amplitude distribution that includes you and your observed universe, can at least roughly factorize into a subspace that corresponds to that little electron, and a subspace that corresponds to everything else in the universe. So that the physically real amplitude distribution is roughly the product of a little blob of amplitude-factor in the subspace for that electron, and the amplitude-factor for everything else in the universe. Got it? 'From the point of view of quantum field theory, particles are identical if and only if they are excitations of the same underlying quantum field. Thus, the question "why are all electrons identical?" arises from mistakenly regarding individual electrons as fundamental objects, when in fact it is only the electron field that is fundamental.' Okay, but that doesn't make the basic jump into a quantum configuration space that is inherently over multiple particles. It just sounds like you're talking about individual disturbances in the aether, or something. As I understand it, an electron isn't an excitation of a quantum electron field, like a wave in the aether; the electron is a blob of amplitude-factor in a subspace of a configuration space whose points correspond to multiple point positions in quantum fields, etc. The difficult jump from classical to quantum is not thinking of an electron as an excitation of a field. Then you could just think of a universe made up of "Excitation A in electron field over here" + "Excitation B in electron field over there" + etc. You could factorize the universe into individual excitations of a field. Your parietal cortex would have no trouble with that one—it doesn't care whether you call the little billiard balls "excitations of an electron field" so long as they still behave like little billiard balls. The difficult jump is thinking of a configuration space that is the product of many positions in many fields, without individual identities for the positions. A configuration space whose points are "a position here in this field, a position there in this field, a position here in that field, and a position there in that field". Not, "A positioned here in this field, B positioned there in this field, C positioned here in that field" etc. You have to reduce the appearance of individual particles to a regularity in something that is different from the appearance of particles, something that is not itself a little billiard ball. Oh, sure, thinking of photons as individual objects will seem to work out, as long as the amplitude distribution happens t factorize. But what happens when you've got your "individual" photon A and your "individual" photon B, and you're in a situation where, a la Feynman paths, it's possible for photon A to end up in position 1 and photon B to end up in position 2, or for A to end up in 2 and B to end up in 1? Then the illusion of classicality breaks down, because the amplitude flows overlap: In that triangular region where the distribution overlaps itself, no fact exists as to which particle is which, even in principle—and in the real world, we often get a lot more overlap than that. I mean, imagine that I take a balloon full of photons, and shake it up. Amplitude's gonna go all over the place. If you label all the original apparent-photons, there's gonna be Feynman paths for photons A, B, C ending up at positions 1, 2, 3 via a zillion different paths and permutations. The amplitude-factor that corresponds to the "balloon full of photons" subspace, which contains bulges of amplitude-subfactor at various different locations in the photon field, will undergo a continuously branching evolution that involves each of the original bulges ending up in many different places by all sorts of paths, and the final configuration will have amplitude contributed from many different permutations. It's not that you don't know which photon went where. It's that no fact of the matter exists. The illusion of individuality, the classical hallucination, has simply broken down. And the same would hold true of a balloon full of quarks or a balloon full of electrons. Or even a balloon full of helium. Helium atoms can end up in the same places, via different permutations, and have their amplitudes add just like photons. Don't be tempted to look at the balloon, and think, "Well, helium atom A could have gone to 1, or it could have gone to 2; and helium atom B could have gone to 1 or 2; quantum physics says the atoms both sort of split, and each went both ways; and now the final helium atoms at 1 and 2 are a mixture of the identities of A and B." Don't torture your poor parietal cortex so. It wasn't built for such usage. Just stop thinking in terms of little billiard balls, with or without confused identities. Start thinking in terms of amplitude flows in configuration space. That's all there ever is. And then it will seem completely intuitive that a simple experiment can tell you whether two blobs of amplitude-factor are over the same quantum field. Just perform any experiment where the two blobs end up in the same positions, via different permutations, and see if the amplitudes add. Part of The Quantum Physics Sequence Next post: "Identity Isn't In Specific Atoms" Previous post: "Feynman Paths"
<urn:uuid:299f2e89-dd14-428b-a7c0-65790db0313b>
CC-MAIN-2016-22
http://lesswrong.com/lw/pl/no_individual_particles/
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464053209501.44/warc/CC-MAIN-20160524012649-00047-ip-10-185-217-139.ec2.internal.warc.gz
en
0.929886
1,924
3.875
4
(Click player button to hear story now.) by Peter Fotis Kapnistos Space-time is a mathematical coordinate system (3 dimensions of space and 1 of time) in which physical events are located in a single continuum. According to Einstein’s Theory of Relativity, gravitation is the “curvature” of space-time. In other words, because an object’s mass makes the curve of space-time bend like a basin in its region, its gravitational force is amplified and attracts other nearby masses. Things are going well up to this point. But imagine for a moment if you could undertake a sudden “reversal” of gravitation. Would you also experience a swift U-turn of space-time? If the force of the Earth’s gravitation is initially low under your feet, but abruptly gets reversed to a point high above your head, what kind of space-time turnaround might you undergo? The reversal of space-time has far-reaching implications. It involves traveling into the past and relegating a great expanse to a tiny step. It’s the stuff of wormholes and Einstein-Rosen bridges. An Einstein-Rosen bridge is a geometrical property of a black hole that manifests itself as a “throat” attached to another set of dimensions or to another universe. In two recent experiments at CERN (the Swiss site of the Large Hadron Collider) a neutrino beam was clocked traveling 60 nanoseconds faster than the velocity of light. The neutrinos seemingly traveled back in time (as if they could arrive at a destination before they even left). If the CERN experiments prove to be accurate, they may unlock the possibility of time travel into the past — or of convenient travel to other stars. In the early stages of our solar system’s formation, fragments of matter were fiercely flung apart, but remained in “quantum entanglement” or superposition. Quantum entanglement is a phenomenon that connects two particles in such a way that changes to one of the particles are instantaneously reflected in the other, although they may seem physically separated by several light years. Einstein described superposition as “spooky action at a distance.” Some clusters of ejected stellar mass eventually merged into planets and their moons. But numerous particles continued in quantum entanglement, because they shared the identical superposition, connected by Einstein-Rosen bridges (or stretched-out space-time wormholes). Since this bond took place at the beginning stages, matter in entanglement is more likely to be found in the interior or close to the core of a planet. The Hollow Earth hypothesis, first put forward in 1692 by the English astronomer Edmund Halley, proposes that the planet Earth is either completely hollow or encloses an extensive interior space. The hollow Earth supposedly contains a small interior sun. There are said to be entrances at the north and south poles. During World War II Hitler sent an expedition to the Baltic Island of Rugen to search for proof of a hollow Earth. Today, that theory has been extended to suggest that the hollow space that connects the north and south poles is really the throat of a space-time wormhole, and the interior sun is actually a rotating black hole, which is prevented by an event horizon from crunching the Earth. * * * Laura Magdalene Eisenhower is the great granddaughter of former US president Dwight Eisenhower. She claims that world leaders have made close contact with aliens. Laura said the US has established covert extraterrestrial bases. She revealed that in 2006 and 2007, she was invited to join a secret American “colony on Mars.” Andrew D. Basiago and William Stillings just reported in the website “Exopolitics” that in the past they had stepped through time and space for the US Department of Defense. They referred to a covered up CIA program hosted at a California community college. Between 1981 and 1983, Barack Obama is said to have “visited Mars” with them by means of a teleportation chamber called a jump room. Regina Dugan, the director of Darpa, allegedly was another member. * * * In the autumn of 2009, a veteran intelligence operative, reactivated into defense agency programs, walked through the old-world streets of an eastern Mediterranean city. He gazed down and visualized structures deep under the pavement, to look into history. In the past, he had performed groundbreaking experiments with remote exploration. Now, enormously behind him, a funicular tunnel of steel tracks drew railway carriages by cable through the base of a cliff and up to its peak. It offered a convincing display that the ancient city was hollow within. A series of complex bunkers deep inside the rock-face installation encircled a subterranean engine that powered a cabled hoisting machine. Like the exotic Berghof elevator, the construction inspired by Bavarian masons and architects was suggestive of Nazi Germany’s suspected National Southern Redoubt, an inner stronghold from which Germany would retaliate. The Allies, who later said the Redoubt fortress existed only in the German imagination, searched for Nazi atomic weapons near the Mediterranean: “Here, defended by nature and by the most efficient secret weapons yet invented, the powers that have hitherto guided Germany will survive to reorganize her resurrection; here armaments will be manufactured in bombproof factories, food and equipment will be stored in vast underground caverns and a specially selected corpus of young men will be trained in guerrilla warfare, so that a whole underground army can be fitted and directed to liberate Germany from occupying forces.” (Supreme Headquarters Allied Expeditionary Force, Weekly Intelligence Summary, March 11, 1945) In his mind’s eye, the intelligence guardian pictured dugout walls, and the slab of a radiation shield with an air lock that opened like a submarine door. Several such doors were lined up along the dark passage where a tunnel of metal rails descended into the crater’s abyss. Behind the doors were the jump rooms of quantum entanglement, connected by Einstein-Rosen bridges. One room shared the same superposition as a region within the interior of Mars. Behind another was a quantum-string conduit to the interior of Venus. As the global elite played and exulted the top-secret enigma of their ancient city, its local residents were forced into a poverty and destitution that threatened to wipe out the Euro. Would the world at last awaken and comprehend that for millennia humans and celestial messengers traveled through the buried gates and jump rooms of quantum entanglement?
<urn:uuid:b8032672-d545-4f9e-9f16-dba45d3fbca7>
CC-MAIN-2016-22
http://myth-os.com/2012/01/14/hollow-earth-wormhole-to-mars/
s3://commoncrawl/crawl-data/CC-MAIN-2016-22/segments/1464049270798.25/warc/CC-MAIN-20160524002110-00098-ip-10-185-217-139.ec2.internal.warc.gz
en
0.950218
1,368
3.671875
4