Could cloud quantum computing be possible? Google wants to make it happen, although some doubt it will happen anytime soon.

Potential benefits of quantum computing

Quantum computers could be a large step above the computers we typically use today. They are made up of quantum bits, or qubits, which process information not just as ones or zeros but as any state in between. This mechanism lets them attack certain problems far more quickly than was previously possible. Unfortunately, technology hasn't given us a fully functional, generally available quantum computer yet. Its future potential is clear, though: according to Jerry Chow, a member of IBM's experimental quantum computing department, "at 50 qubits, universal quantum computing would reach that inflection point and be able to solve problems existing computers can't handle." This future might be closer than we think. IBM plans to construct and distribute this 50-qubit system within the next few years, while Google projects that it will complete a 49-qubit system by the end of this year.

One real-life use of quantum computing is pharmaceutical science. Right now, researchers struggle to understand how molecular structures bond together; it takes complex computer simulations to capture the atomic and subatomic motion involved in creating new drugs. Solving this could result in cheaper and better drugs. Scott Crowder from IBM explained that "you don't even ask those questions on a classical computer because you know you're going to get it wrong." Once quantum computing hits its prime, though, medicines could potentially be developed much more quickly and at much lower prices.

Another problem quantum computing could solve is one you might not expect: fertilizer production, according to Jarrod McClean, a computing sciences fellow at Lawrence Berkeley National Laboratory. Making mass-produced fertilizer "accounts for one percent to two percent of the world's energy use per year." A much more energy-efficient option should exist, but, according to McClean, "it's been too challenging for classical systems to date" to help researchers create it in the lab. He has high hopes that quantum computers will accomplish it in the near future.

And the application doesn't have to be revolutionary to be helpful. These computers could help organize delivery routes, especially during particularly busy times like Christmas, by coordinating thousands of self-driving cars (assuming they will be commonly used in the future). Quantum computers could also improve translation software, among other small but productive uses. The potential of quantum computing is almost endless, from finance to energy. It is beginning to become available to certain people right now and is expected to become more mainstream soon (although there are debates about when, exactly). But can Google bring it to the cloud?

What is Google doing?

While Google has been working on quantum computing for years, it has only recently started looking at how to turn it into a business. In fact, Google has already started offering "science labs and artificial intelligence researchers early access to its quantum machines over the Internet in recent months." Its motivation for giving this early access, according to Bloomberg, is that these researchers will build more tools to go with the technology, helping to make a cloud quantum computing service as fast and powerful as possible.
Google is also involved with ProjectQ, "an open-source effort to get developers to write code for quantum computers." According to a quantum computing researcher at Stanford University, Google is not trying to hide that "they're building quantum hardware and they would, at some point in the future, make it a cloud service." Additionally, according to scientist Jonathan DuBois at Lawrence Livermore National Laboratory, Google "pledged that government and academic researchers would get free access."

While there's still quite a bit of debate about when quantum computers will actually be usable, Google's efforts could skyrocket it to the top of the ongoing cloud wars. If what many companies are predicting comes true, processing tasks could become millions of times faster. Offering cloud quantum computing is a smart business decision, considering that these machines are very large and hard to maintain, so very few companies could host one themselves. As of right now, Google rents computing by the minute, so if quantum computers can cut compute time by such a large percentage, Google would have a huge price advantage over the competition. Google's cloud compute prices are currently higher than Amazon's and Microsoft's for most instances.

We may be getting ahead of ourselves, though. Seth Lloyd, a professor at the Massachusetts Institute of Technology, argued that useful applications won't arrive until a system has at least 100 qubits, although other researchers and organizations seem to disagree. When Google announced its quantum computing efforts back in 2014, it claimed it would prove its "supremacy" by performing equal to or better than supercomputers by the end of 2017. Of course, Google isn't the only one going after quantum computers. IBM already offers access to its specialized quantum computing platform and plans to create a 50-qubit quantum system within the next five years. This past May, it also added a 17-qubit prototype quantum processor to its service, although that is still in its experimental phase.

The future of cloud quantum computing

Chad Rigetti, founder of Rigetti Computing, which has netted over $69 million from investors for quantum computing software and equipment, believes that quantum computing will become as popular as AI is now, although he isn't sure exactly when. "This industry is very much in its infancy," Rigetti said. "No one has built a quantum computer that works." Hopefully, the future of cloud quantum computing will arrive sooner rather than later. Scientists believe its applications are almost endless, from improving the output of solar panels to creating medicines and even fertilizers. With the numerous applications, faster speeds, and potentially lower prices, cloud quantum computing could revolutionize technology.
We live in a time when the phrase "artificial intelligence" (AI for short) is trendy and appears in the marketing descriptions of many products and services. But what precisely is AI?

Broadly speaking, AI originated as an idea to create artificial "thinking" along the lines of the human brain. As of today, however, we can only make assumptions about how the human brain works, based primarily on medical research and observation. From a medical point of view, we know that the brain looks like a complex network of connections in which neurons are the main element, and that our thoughts, memory, and creativity are a flow of electrical impulses. This knowledge has given hope that an analogous brain could be constructed in electronic form, either hardware or software, where neurons are replaced by electronics or code. However, since we are not 100% sure exactly how the brain works, all current AI models are mathematical approximations and simplifications, serving only certain specific uses. Nevertheless, we know from observation that it is possible to create solutions that mimic the mind quite well: they can recognize writing, images (objects), music, and emotions, and even create art based on previously acquired experience, although the results of the latter are sometimes controversial.

In what else does AI resemble the human brain? Well… it has to learn! AI solutions differ from classical algorithms in one fundamental way: the initial product is a philosophical "tabula rasa", or blank slate, which must first be taught. In complex living organisms, knowledge emerges with development: the ability to speak, to move independently, to name objects; in humans and some animal species, learning is also organized in kindergartens, schools, and universities, and continues during work and independent development. It is analogous in most artificial intelligence solutions: the AI model must first receive specific knowledge, most often in the form of examples, to be able to function effectively later as an "adult" algorithm. Some solutions learn once, while others improve their knowledge while functioning (online learning or reinforcement learning). This vividly resembles human society: some people finish their education and work for the rest of their lives in one company doing one task, while others have to train throughout their lives as their work environment changes dynamically.

Is AI already "smarter" than humans? As an interesting aside, we can compare the "computing power" of the brain with the computing power of computers. This will, of course, be a simplification, because the nature of the two is quite different. First, how many neurons does the average human brain have? It was initially estimated at around 100 billion neurons. However, according to recent research (https://www.verywellmind.com/how-many-neurons-are-in-the-brain-2794889), the number of neurons in the "average" human brain is "slightly" lower, by about 14 billion, at 86 billion neuronal cells. For comparison, the brain of a fruit fly has about 100 thousand neurons; a mouse, 75 million; a cat, 250 million; a chimpanzee, 7 billion. An interesting case is the elephant's brain (much larger than a human's in size), which has … 257 billion neurons, definitely more than the brain of a human.
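For a quick sense of scale, those neuron counts can be compared directly. Here is a small Python sketch using only the approximate figures quoted above:

```python
# Approximate neuron counts quoted in the text.
neurons = {
    "fruit fly": 100_000,
    "mouse": 75_000_000,
    "cat": 250_000_000,
    "chimpanzee": 7_000_000_000,
    "human": 86_000_000_000,
    "elephant": 257_000_000_000,
}

human = neurons["human"]
for species, count in neurons.items():
    # Show each brain's size relative to the human brain.
    print(f"{species:>10}: {count:>15,} neurons ({count / human:.5f} x human)")
```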
From medical research, we know that each neuron has about 1,000 connections, or synapses, with neighboring neurons, so in the case of humans the total number of connections is around 86 trillion (86 billion neurons * 1,000 connections). In simplified terms, we can assume that each synapse performs one "operation", analogous to one instruction in a processor.

At what speed does the brain work? In total … not much. We can estimate it thanks to brain-computer interfaces (BCIs), which appeared not long ago as an outgrowth of medical electroencephalography (EEG) devices, such as the headsets produced by Emotiv, which let us control a computer using brain waves. Of course, they do not integrate directly with the cerebral cortex but measure activity by analyzing electrical signals. Based on this, we can say that the brain works at a variable speed (analogous to a processor's turbo mode), between 0.5 Hz in the so-called delta state (complete rest) and about 100 Hz in the gamma state (stress, full tension). Thus, we can estimate the maximum computational power of the brain as 86 trillion connections times 100 operations per second, which is 8.6*10^15 operations per second, or 8.6 petaflops! Despite the brain's relatively slow clock rate, this is a colossal number, thanks to the parallelization of operations. From Wikipedia (https://en.wikipedia.org/wiki/Supercomputer), we learn that supercomputers did not break this limit until the first decade of the 21st century. The situation will change with the advent of quantum computers, which inherently work in parallel, just like the human brain; as of today, though, quantum computing technology is still in its infancy. In conclusion, AI has not yet overtaken the human brain, but it probably will someday. However, we are only talking about raw speed here, leaving aside the whole issue of creativity, "coming up with" ideas, emotions, and so on.

AI and mobile devices

Artificial intelligence applications require substantial computational power, especially at the so-called learning stage, which poses a significant challenge when integrating them with AR and VR solutions. Unfortunately, AR and VR devices mostly have very limited resources, as they are effectively ARM-based mobile platforms comparable in performance to smartphones. As a result, most artificial intelligence models are so computationally (mathematically) complex that they cannot be trained directly on mobile devices. OK, you can, but it will take an unacceptably long time. So in most cases, to train models we use powerful PCs (clusters) and GPU accelerators, mainly Nvidia CUDA. The trained knowledge is then "exported" into a simplified model "implanted" into AR and VR software or mobile hardware. In our next blog post, you'll learn how we integrated AI into VR and AR, how we dealt with the limited performance of mobile devices, and what we use AI for in AR and VR.
Many protocols like SSH, OpenPGP, S/MIME, and SSL/TLS rely on RSA encryption, where access to data is secured with two keys. The encryption key is public and differs from the decryption key, which is kept secret. The cryptosystem's reliability exploits the fact that factoring the product of two large primes takes years even for today's fastest supercomputers, so protocols based on RSA have proven paramount to everything from processing payments to storing classified intelligence. RSA, however, might become obsolete soon as quantum computing systems become more stable and efficient. Using only five atoms, a team of international researchers showed how to factor a number, albeit a trivial one chosen for demo purposes. The researchers say there aren't any physical restrictions that might hinder scalability: theoretically, more atoms could be added and much larger numbers could be factored at lightning speed. That doesn't make the engineering challenges easy, though.

RSA was first described in 1977 by Ron Rivest, Adi Shamir, and Leonard Adleman of the Massachusetts Institute of Technology. In this asymmetric cryptography, two different but mathematically linked keys, one public and one private, are used to encrypt and decrypt a message. The public key, which anyone can see and use to encrypt a message, is based on the product of two large primes, plus an auxiliary exponent. Multiplying two large primes into an integer is easy, but determining the original primes from the product alone is very difficult.

In 1994, Peter Shor, the Morss Professor of Applied Mathematics at MIT, came up with a quantum algorithm that calculates the prime factors of a large number vastly more efficiently than a classical computer. To actually run the algorithm, though, a quantum computer would require many qubits, or quantum bits. In conventional computers, operations that transform inputs into outputs work with bits, which can be 0s or 1s. Qubits are atomic-scale units that can be 0 and 1 at the same time, a state known as superposition. What this means is that a quantum computer can essentially carry out two calculations in parallel; a system that works with qubits can be not twice but millions of times faster than a conventional computer.

Previously, scientists designed quantum computers that could factor the number 15 (primes 3 and 5), but these couldn't be scaled to factor larger numbers. "The difficulty is to implement [the algorithm] in a system that's sufficiently isolated that it can stay quantum mechanical for long enough that you can actually have a chance to do the whole algorithm," said Isaac Chuang, professor of physics and professor of electrical engineering and computer science at MIT. Chuang and colleagues at MIT and the University of Innsbruck in Austria claim they not only found a way to make a quantum system scalable, but also more efficient. Typically, it took 12 qubits to factor the number 15; the researchers factored the same number using only five qubits, or atoms. Each of the five atoms has an electron removed, which gives it a charge, and the resulting ions are held in an ion trap, stabilized by a magnetic field that keeps them in place. Logic gate operations are performed using laser pulses on four of the atoms, while the fifth is used to store or extract results. Using the fifth atom to store information was the brilliant part. "Measuring a qubit knocks it out of superposition and thereby destroys the information it holds.
Restricting the measurement step to the fifth ion kept the four involved in the computation from being corrupted," wrote Amy Nordrum in an article for IEEE. The number 15, albeit trivial to solve, is the smallest that can meaningfully demonstrate Shor's algorithm. A working system developed at the University of Innsbruck factored the number with a confidence exceeding 99 percent, as reported in the journal Science. "In future generations, we foresee it being straightforwardly scalable, once the apparatus can trap more atoms and more laser beams can control the pulses," Chuang says. "We see no physical reason why that is not going to be in the cards."

To break a typical 1024-bit key, the same system would need thousands of qubits and simultaneous laser pulses. This is doable but highly challenging, and it might take a long time before a quantum computer can break RSA. Moreover, many researchers are already aware of the limitations of current cryptosystems and are preparing for the future: "quantum-resistant public-key algorithms". "Continued advances in quantum computing will draw broad attention to the threat it represents to all of today's widely used public-key cryptosystems – the cryptography that underlies electronic commerce and secure communications on the Internet. The security community will begin planning the migration to new 'quantum-resistant' public-key cryptosystems for which quantum computers provide no computational advantage," said Brian LaMacchia, Director, Security & Cryptography, Microsoft Research.
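To see concretely how the RSA keys described above relate to factoring, here is a toy round trip in Python built on the very number the experiment factored, 15. The prime sizes, exponent choice, and message are purely illustrative; real RSA uses primes hundreds of digits long.

```python
# Toy RSA over the same modulus the five-atom experiment factored: n = 15.
p, q = 3, 5
n = p * q                       # public modulus
phi = (p - 1) * (q - 1)         # Euler's totient of n: 8

e = 3                           # public exponent, chosen coprime to phi
d = pow(e, -1, phi)             # private exponent: e*d = 1 (mod phi), so d = 3

message = 2
ciphertext = pow(message, e, n)     # encrypt: 2^3 mod 15 = 8
recovered = pow(ciphertext, d, n)   # decrypt: 8^3 mod 15 = 2
assert recovered == message

# Breaking the key is exactly the factoring problem: given only n,
# recover p and q. Trivial here, infeasible classically at real key sizes.
factors = [k for k in range(2, n) if n % k == 0]
print(factors[0], n // factors[0])  # 3 5
```

Everything above the final comment is what a sender and receiver do legitimately; the last two lines are the attacker's job, the step Shor's algorithm would make dramatically faster on a large enough quantum machine.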
The original University of Chicago "uchicago news" article by Louise Lerner can be read here.

Researchers used the U.S. Department of Energy's Advanced Photon Source (APS) to help them invent an innovative way for different types of quantum technology to "talk" to each other using sound. The study, published in Nature Physics, is an important step in bringing quantum technology closer to reality.

Scientists are eyeing quantum systems, which tap the quirky behavior of the smallest particles, as the key to a fundamentally new generation of atomic-scale electronics for computation and communication. But a persistent challenge has been transferring information between different types of technology, such as quantum memories and quantum processors. "We approached this question by asking: Can we manipulate and connect quantum states of matter with sound waves?" said senior study author David Awschalom, the Liew Family Professor with the Institute for Molecular Engineering and senior scientist at Argonne National Laboratory.

One way to run a quantum computing operation is to use "spins", a property of an electron that can be up, down, or both. Scientists can use these like zeroes and ones in today's binary computer programming language. But getting this information elsewhere requires a translator, and scientists thought sound waves could help. "The object is to couple the sound waves with the spins of electrons in the material," said graduate student Samuel Whiteley, the co-first author on the Nature Physics paper. "But the first challenge is to get the spins to pay attention." So they built a system with curved electrodes to concentrate the sound waves, like using a magnifying lens to focus a point of light.

The results were promising, but the researchers from the University of Chicago and Argonne National Laboratory needed more data. To get a better look at what was happening, they worked with scientists at the Center for Nanoscale Materials (CNM) at Argonne to observe the system in real time. Essentially, they used extremely bright, powerful x-rays from the CNM/X-ray Science Division 26-ID-C x-ray beamline at the Advanced Photon Source as a microscope to peer at the atoms inside the material as the sound waves moved through it at nearly 7,000 kilometers per second. (Both the CNM and the APS are Office of Science user facilities at Argonne.) "This new method allows us to observe the atomic dynamics and structure in quantum materials at extremely small length scales," said Awschalom. "This is one of only a few locations worldwide with the instrumentation to directly watch atoms move in a lattice as sound waves pass through them."

One of the many surprising results, the researchers said, was that the quantum effects of sound waves were more complicated than they'd first imagined. To build a comprehensive theory behind what they were observing at the subatomic level, they turned to Prof. Giulia Galli, the Liew Family Professor at the IME and a senior scientist at Argonne. Modeling the system involves marshalling the interactions of every single particle in the system, which grows exponentially, Awschalom said, "but Professor Galli is a world expert in taking this kind of challenging problem and interpreting the underlying physics, which allowed us to further improve the system."

It's normally difficult to send quantum information for more than a few microns, said Whiteley; that's the width of a single strand of spider silk. This technique could extend control across an entire chip or wafer.
"The results gave us new ways to control our systems, and opens venues of research and technological applications such as quantum sensing," said postdoctoral researcher Gary Wolfowicz, the other co-first author of the study.

The discovery is another from the University of Chicago's world-leading program in quantum information science and engineering; Awschalom is currently leading a project to build a quantum "teleportation" network between Argonne and Fermi National Accelerator Laboratory to test principles for a potentially un-hackable communications system. The scientists pointed to the confluence of expertise, resources, and facilities at the University of Chicago, the Institute for Molecular Engineering, and Argonne as key to fully exploring the technology. "No one group has the ability to explore these complex quantum systems and solve this class of problems; it takes state-of-the-art facilities, theorists and experimentalists working in close collaboration," Awschalom said. "The strong connection between Argonne and the University of Chicago enables our students to address some of the most challenging questions in this rapidly moving area of science and technology."

See: Samuel J. Whiteley1, Gary Wolfowicz1,2, Christopher P. Anderson1, Alexandre Bourassa1, He Ma1, Meng Ye1, Gerwin Koolstra1, Kevin J. Satzinger1,3, Martin V. Holt4, F. Joseph Heremans1,4, Andrew N. Cleland1,4, David I. Schuster1, Giulia Galli1,4, and David D. Awschalom1,4*, "Spin–phonon interactions in silicon carbide addressed by Gaussian acoustics," Nat. Phys., published online 11 February 2019. DOI: 10.1038/s41567-019-0420-0

Author affiliations: 1The University of Chicago, 2Tohoku University, 3University of California, Santa Barbara, 4Argonne National Laboratory

The devices and experiments were supported by the Air Force Office of Scientific Research; material for this work was supported by the U.S. Department of Energy (DOE). Use of the Center for Nanoscale Materials, an Office of Science user facility, was supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, under Contract No. DE-AC02-06CH11357. S.J.W. and K.J.S. were supported by the NSF GRFP, C.P.A. was supported by the Department of Defense through the NDSEG Program, and M.V.H., F.J.H., A.N.C., G.G. and D.D.A. were supported by the U.S. DOE Office of Science-Basic Energy Sciences. This work made use of the UChicago MRSEC (NSF DMR-1420709) and Pritzker Nanofabrication Facility, which receives support from the SHyNE, a node of the NSF's National Nanotechnology Coordinated Infrastructure (NSF ECCS-1542205). This research used resources of the Advanced Photon Source, a U.S. Department of Energy (DOE) Office of Science User Facility operated for the DOE Office of Science by Argonne National Laboratory under Contract No. DE-AC02-06CH11357.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science. The U.S.
Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the Office of Science website.
The 2022 Nobel Prize in Physics was awarded to experimental physicists Alain Aspect, John Clauser, and Anton Zeilinger. The three pioneers conducted groundbreaking research using entangled quantum particles: subatomic particles that behave as if they are linked even when there is nothing between them, a phenomenon Albert Einstein famously called "spooky action at a distance".

"Quantum information science is a vibrant and rapidly developing field," said Eva Olsson, a member of the Nobel Committee for Physics. "It has broad potential implications in areas such as secure information transfer, quantum computing, and sensing technology."

But it wasn't always like this. In fact, quantum physics itself was a fiercely debated field. In the 1930s, one of the fiercest clashes in physics history erupted between Albert Einstein on one hand and Niels Bohr and Erwin Schrödinger on the other (all three Nobel laureates). Einstein believed that everything had to be concrete and knowable at a fundamental level, whereas the pioneers of quantum mechanics argued that reality can be uncertain and that particles don't possess certain properties until they are measured.

John Clauser initially thought Einstein was right, and in the 1970s he set up clever experiments to settle the debate. But it didn't go as planned: in fact, his experiments disproved Einstein and laid the groundwork for a deeper understanding of quantum mechanics, and in particular, quantum entanglement.

Quantum entanglement really is a bizarre process. It is a phenomenon that can occur when particles (most commonly photons) are linked together in a way that persists no matter how far apart they are in space, so that their states cannot be described independently of each other. Physical properties such as position, momentum, spin, and polarization can be perfectly correlated between entangled particles even when they are miles away from each other. Basically, you can study one of the entangled particles and gain information about the linked particles as well, a phenomenon that has no equivalent in classical mechanics.

"I would not call entanglement 'one,' but rather 'the' trait of quantum mechanics," Thors Hans Hansson, a member of the Nobel Committee, quoted Schrödinger as writing in 1935. "The experiments performed by Clauser and Aspect opened the eyes of the physics community to the depth of Schrödinger's statement, and provided tools for creating and manipulating and measuring states of particles that are entangled although they are far away."

Einstein (and many other physicists) suspected that if the particles are linked, there must be some "hidden variables" tying them together. Instead, experimental research from the three laureates showed that the entanglement is genuine and not owed to other factors.

Ironically, Clauser, who now runs his own company in California, recalls how his advisor thought this field of research was a "waste of time", advised him to focus on something else, and warned him against "ruining" his career. Well, as it turns out, the very opposite happened. The trio's experiments were also previously awarded the Wolf Prize, sometimes considered a precursor to the Nobel Prize; in fact, the three had been considered favorites for a Nobel Prize for a decade.
However, Zeilinger, who is currently a professor of physics at the University of Vienna, was eager to point out that the three did not work alone, and dedicated the prize to the young people who helped do the work. "This prize is an encouragement to young people," said Zeilinger. "It would not be possible without more than 100 young people who worked with me over the years." Zeilinger also gave some advice to young researchers, echoing the thoughts of Dennis Sullivan, the 2022 Abel Prize laureate (in mathematics): "Do what you find interesting, and don't care too much about possible applications."

But it should also be said that the committee considered the applications of the trio's experiments. While the field of quantum mechanics can seem ethereal and removed from everyday life, researchers are increasingly finding applications for this technology. For starters, the quantum computers that hold so much promise for solving complex problems are based on quantum processes studied by the three physicists. Another application is quantum communications, a technology whose security promises to be nigh-unbreakable. For instance, a research group from China managed to beam entangled pairs of photons up to a satellite, proving that entanglement can survive trips of over 1,000 kilometers; that group was spearheaded by one of Zeilinger's former students. This type of quantum voyage paves the way for securing messages with a "quantum key" that is destroyed any time someone attempts to eavesdrop and intercept the information. Basically, this could mean essentially unbreakable cryptography.

However, while the field is growing rapidly and has a lot of potential, there is much we still don't know about entanglement. In theory, everything could be entangled, but in practice the process seems chaotic and random, and the largest experiments have entangled around a dozen photons. Another project has entangled around a thousand atoms with a single photon.

In 2021, the Nobel Prize in Physics was awarded to three researchers who study complex systems that are particularly important for climate science. Earlier this week, the Nobel committee awarded the Physiology or Medicine prize to Svante Pääbo for his many contributions "concerning the genomes of extinct hominins and human evolution." All Nobel Prizes come with a cash reward worth 10 million Swedish krona ($920,000); if there are multiple laureates, the reward is shared.
The laws of physics, among the greatest discoveries of humankind, have emerged over many centuries in a process often influenced by the prominent thinkers of the time. This process has had a profound influence on the evolution of science and gives the impression that some laws could not have been discovered without the knowledge of earlier ages. Quantum mechanics, for example, is built on classical mechanics using various mathematical ideas that were prominent at the time. But perhaps there is another way of discovering the laws of physics, one that does not depend on the understanding we have already gained about the universe.

Today Raban Iten, Tony Metger, and colleagues at ETH Zurich in Switzerland say they have developed just such a method and used it to discover laws of physics in an entirely novel way. And they say it may be possible to use this method to find wholly new formulations of physical laws.

First, some background. The laws of physics are simple representations that can be interrogated to provide information about more complex scenarios. Imagine setting a pendulum in motion and asking where the bob of the pendulum will be at some point in the future. One way to answer this is by measuring the position of the pendulum as it swings. This data can then be used as a kind of look-up table to find the answer. But the laws of motion provide a much easier way of discovering the answer: simply plug values for the various variables into the appropriate equation. That gives the correct answer too. The equation can therefore be thought of as a compressed representation of reality.

This immediately suggests how neural networks might find these laws. Given some observations from an experiment, a swinging pendulum, for example, the goal is to find some simpler representation of this data. The idea from Iten, Metger, and co is to feed this data into the machine so that it learns how to make an accurate prediction of the position. Once the machine has learned this, it can then predict the position from any initial set of conditions. In other words, it has learned the relevant law of physics.

To find out whether this works, the researchers feed data from a swinging-pendulum experiment into a neural network they call SciNet. They go on to repeat this for experiments that include the collision of two balls, the results of a quantum measurement on a qubit, and even the positions of the planets and sun in the night sky.

The results make for interesting reading. Using the pendulum data, SciNet is able to predict the future frequency of the pendulum with an error of less than 2 percent. What's more, Iten, Metger, and co are able to interrogate SciNet to see how it arrives at the answer. This doesn't reveal the precise equation, unfortunately, but it does show that the network uses only two variables to come up with the solution. That's exactly the same number as in the relevant laws of motion.

But that isn't all. SciNet also provides accurate predictions of the angular momentum of two balls after they have collided. That's only possible using the conservation of momentum, a version of which SciNet appears to have discovered. It also predicts the measurement probabilities when a qubit is interrogated, clearly using some representation of the quantum world. Perhaps most impressive is that the network learns to predict the future position of Mars and the sun using the initial position as seen from Earth.
That's only possible using a heliocentric model of the solar system, an idea that took humans centuries to hit on. And indeed, an interrogation of SciNet suggests it has learned just such a heliocentric representation. "SciNet stores the angles of the Earth and Mars as seen from the Sun in the two latent neurons—that is, it recovers the heliocentric model of the solar system," say the researchers.

That's impressive work, but it needs to be placed in perspective. This may be the first demonstration that an artificial neural network can compress data in a way that reveals aspects of the laws of physics. But it is not the first time that a computational approach has derived these laws. A few years ago, computer scientists at Cornell University used a genetic algorithm, which exploits the process of evolution, to derive a number of laws of physics from experimental data. These included conservation laws for energy and momentum. That system even spat out the equations themselves, not just a hint about how it was calculating, as SciNet does. Clearly, evolutionary algorithms have the upper hand in the process of discovering the laws of physics from raw experimental data. (Given that evolution is the process that produced biological neural networks in the first place, it is arguable that it will forever be the more powerful approach.)

There is an interesting corollary to all this. It has taken humanity centuries to discover the laws of physics, often in ways that have depended crucially on previously discovered laws. For example, quantum mechanics is based on classical mechanics. Could there be better laws that can be derived from experimental data without any prior knowledge of physics? If so, this machine-learning approach, or the one based on evolution, should be exactly what's needed to find them.

Ref: arxiv.org/abs/1807.10300: Discovering Physical Concepts with Neural Networks
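The paper's implementation isn't reproduced here, but the architecture the article describes, an encoder that compresses observations into a few latent neurons plus a decoder that answers a question from them, can be sketched in a few lines of PyTorch. This is a minimal illustrative stand-in, not the authors' SciNet; all layer sizes, names, and the dummy data are assumptions.

```python
import torch
import torch.nn as nn

class SciNetSketch(nn.Module):
    """Encoder-decoder in the spirit of SciNet: observations are squeezed
    through a small latent layer, then combined with a 'question' input."""
    def __init__(self, n_obs=50, n_latent=2, n_question=1):
        super().__init__()
        # Encoder: compress a recorded time series down to n_latent numbers.
        self.encoder = nn.Sequential(
            nn.Linear(n_obs, 64), nn.ELU(),
            nn.Linear(64, n_latent),
        )
        # Decoder: answer a question (e.g., "position at time t?") from them.
        self.decoder = nn.Sequential(
            nn.Linear(n_latent + n_question, 64), nn.ELU(),
            nn.Linear(64, 1),
        )

    def forward(self, observations, question):
        latent = self.encoder(observations)
        return self.decoder(torch.cat([latent, question], dim=-1))

# Training sketch: minimize prediction error. The interesting part comes
# afterwards: inspecting `latent` to see which variables the network kept.
model = SciNetSketch()
obs = torch.randn(8, 50)      # batch of 8 recorded trajectories (dummy data)
t = torch.rand(8, 1)          # times we ask about
target = torch.randn(8, 1)    # true positions at those times (dummy data)
loss = nn.functional.mse_loss(model(obs, t), target)
loss.backward()
```

The bottleneck is the point: if the network predicts well while forced through two latent neurons, those two numbers play the role of the two variables in the law of motion.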
Exploring the magnetism of a single atom

An EPFL-led research collaboration has shown for the first time the maximum theoretical limit of the energy needed to control the magnetization of a single atom. This fundamental work can have great implications for the future of magnetic research and technology.

Magnetic devices like hard drives, magnetic random access memories (MRAMs), molecular magnets, and quantum computers depend on the manipulation of magnetic properties. In an atom, magnetism arises from the spin and orbital momentum of its electrons. "Magnetic anisotropy" describes how an atom's magnetic properties depend on the orientation of the electrons' orbits relative to the structure of a material. It also provides directionality and stability to magnetization. Publishing in Science, researchers led by EPFL combined various experimental and computational methods to measure for the first time the energy needed to change the magnetic anisotropy of a single cobalt atom. Their methodology and findings can impact a range of fields, from fundamental studies of single-atom and single-molecule magnetism to the design of spintronic device architectures.

Magnetism is used widely in technologies from hard drives to magnetic resonance, and even in quantum computer designs. In theory, every atom or molecule has the potential to be magnetic, since this depends on the movement of its electrons. Electrons move in two ways: spin, which can loosely be thought of as spinning around themselves, and orbit, which refers to an electron's movement around the nucleus of its atom. The spin and orbital motion together give rise to the magnetization, similar to an electric current circulating in a coil and producing a magnetic field. The spinning direction of the electrons therefore defines the direction of the magnetization in a material.

The magnetic properties of a material have a certain "preference" or "stubbornness" towards a specific direction. This phenomenon is referred to as "magnetic anisotropy", and is described as the "directional dependence" of a material's magnetism. Changing this "preference" requires a certain amount of energy. The total energy corresponding to a material's magnetic anisotropy is a fundamental constraint on the downscaling of magnetic devices like MRAMs, computer hard drives, and even quantum computers, which use different electron spin states as distinct information units, or "qubits".

The team of Harald Brune at EPFL, working with scientists at ETH Zurich, the Paul Scherrer Institute, and the IBM Almaden Research Center, developed a method to determine the maximum possible magnetic anisotropy for a single cobalt atom. Cobalt, which is classed as a transition metal, is widely used in the fabrication of permanent magnets as well as in magnetic recording materials for data storage applications. The researchers used a technique called inelastic electron tunneling spectroscopy to probe the quantum spin states of a single cobalt atom bound to a magnesium oxide (MgO) layer. The technique uses an atom-sized scanning tip that allows the passage (or "tunneling") of electrons to the bound cobalt atom. When electrons tunneled through, they transferred energy to the cobalt atom, inducing changes in its spin properties. The experiments showed the maximum magnetic anisotropy energy of a single atom (~60 millielectronvolts) and the longest spin lifetime for a single transition-metal atom.
This large anisotropy leads to a remarkable magnetic moment, which has been determined with synchrotron-based measurements at the X-Treme beamline of the Swiss Light Source. Though fundamental, these findings open the way to a better understanding of magnetic anisotropy and present a single-atom model system that could conceivably be used as a future "qubit". "Quantum computing uses quantum states of matter, and magnetic properties are such a quantum state," says Harald Brune. "They have a lifetime, and you can use the individual surface-adsorbed atoms to make qubits. Our system is a model for such a state. It allows us to optimize the quantum properties, and it is easier than previous ones, because we know exactly where the cobalt atom is in relation to the MgO layer."

This work represents a collaboration between EPFL's Laboratory of Nanostructures at Surfaces (LNS), IBM's Almaden Research Center, ETH Zurich's Department of Materials, the Paul Scherrer Institute's Swiss Light Source, and Georgetown University's Department of Physics.

Rau IG, Baumann S, Rusponi S, Donati F, Stepanow S, Gragnaniello L, Dreiser J, Piamonteze C, Nolting F, Gangopadhyay S, Albertini OR, Macfarlane RM, Lutz CP, Jones B, Gambardella P, Heinrich AJ, and Brune H. "Reaching the magnetic anisotropy limit of a 3d metal atom." Science, 2014. DOI: 10.1126/science.1252841
Radhika Iyer – 2022 Teddy Rocks Maths Essay Competition Commended Entry

Data transmission is often noisy. Information can easily get garbled, and imperfect information frequently has a cost associated with it. Coding theory is a field of mathematics that tries to make transmission more reliable by using error correcting codes, which are methods of detecting and correcting errors. Throughout this delve into error correcting codes, we will consider the transmission of strings of binary (base 2) digits, as every letter, symbol, or pixel from an image can be represented as a string of 0s and 1s. Additionally, when a message of length k is sent, there are 2^k possible messages, as there are 2 options for each bit. Error correcting codes send more than k binary digits, with these extra digits, called parity digits, helping to detect and correct errors.

One example of error correcting codes is repetition codes, where we send each message multiple times. For example, if we sent 0011 twice, as 00110011, then the receiver could compare the second block of four bits against the first block. A recurrent term in coding theory is the information rate (R), which records how much information is carried on average by every bit that is sent. For repetition codes, the information rate is really low, here 0.5, and it can be lower still if blocks are sent even more times. Therefore, in practice, we would prefer to use other error correction codes.

The weight of a binary sequence is the number of bits in the message that are equal to 1. Parity check codes work by adding a parity check bit at the end that makes the message have even (or odd) weight. For example, consider a scenario where we want even weight, and the message we are trying to send is 0011. There are two 1s right now (an even count), so we add a 0 at the end to keep the weight even. If we were trying to send 1110, then we would add a 1. This is equivalent to making the parity digit equal to the sum of all the digits modulo 2. Then, when a message is received, if the parity bit does not match the weight of the message, at least one bit must have flipped in transmission. Although we can't tell where an error occurred, and we can't detect an even number of changes, the information rate for this code is 4/5 = 0.8. If we want a single-error-detecting code for a message of length 4 bits, we always need to add a parity bit, so an overall message of length 5 is the shortest possible, and 0.8 is the highest possible information rate for a message of length 4. In general, parity check codes for messages of length n have information rate R = n/(1 + n).

So far, the focus has been on detecting errors, but we still need to correct them and recover the original message. I will be referring to codewords, which are the overall transmissions that combine our original message with the added bits, and which help transmit messages with less likelihood of error. Another important definition is the Hamming distance, which is the number of positions at which two codewords differ. For example, 1010 and 1001 have a Hamming distance of 2, as the last two bits are different.

Let's take a scenario where we are trying to send either a True or a False, with 1 for True and 0 for False. If an error occurred when sending this message, then we would never be able to know whether the original message was True or False.
Now, let's say that 11 is True and 00 is False. We still would not be able to tell what the original message was if we received 01, as this has a Hamming distance of 1 from both 00 and 11. So let's instead say that 111 is True and 000 is False. If we received 101, 110, or 011, then we could assume that a single error had occurred and that what was originally sent was True, as only one change is needed; likewise, if we received 001, 010, or 100, then the original message was False. This is known as majority logic. This code is called the (3,1) repetition code, as three bits are sent, with a message of length 1 (0 or 1) being identified.

There is a link with geometry here – this is where sphere packing comes in. Sphere packing concerns arranging non-overlapping spheres within a space, trying to maximise the total volume of the spheres that fit within that space. If we consider (0,0,0) and (1,1,1) as points in a 3D space, then all the other possible received messages, where only one error has occurred, can also be visualised as vertices of a cube. Two spheres of radius 1 can then be centred at (0,0,0) and (1,1,1), and the vertices contained within each sphere can be interpreted as codewords for the point at the centre of that sphere. To fit in more original messages, we would like more of these spheres to be packed around points, so that more messages can be transmitted. Sphere packing, where all spheres are disjoint, can be used to find error correcting codes that can always detect the position of errors.

Perfect Hamming codes are a well-known class of codes that can correct single-bit errors, and these occur when all vertices, in however many dimensions of Euclidean space, can be contained within spheres of the smallest radius we can make possible. This means that they attain the Hamming bound, or sphere-packing bound. Another way of putting this is that a perfect code occurs when every vertex is either a codeword itself or only one edge (a Hamming distance of 1) away from a codeword.

The most well-known Hamming code is the (7,4) code, which uses a 'generator matrix' to create three parity bits added to the four bits that make up the message, and it can detect and correct single errors. It has a relatively high information rate: R for the (7,4) code is 4/7 = 0.571 (3 significant figures), compared with 0.333 for the (3,1) code. What is interesting to note is that (7,4) is the first perfect Hamming code after (3,1). After this, the next perfect code is (15,11). A pattern you may have spotted is that the first number in all of these brackets is one less than a power of two. To explain this, consider the (7,4) case. If we consider codewords on a 7-dimensional hypercube, every codeword has 7 edges exiting its point, and so, including itself, there are 8 vertices involved for every message. Now, as we are using messages of length 4, there are 16 possible messages, and 16 x 8 is 128, which is 2^7. On a hypercube in n dimensions, the total number of vertices is always 2^n. This makes an overall message length that is one less than a power of two, like 7 = 2^3 - 1, a perfect scenario, as every single vertex is then accounted for by some codeword's sphere (n + 1 = 8 vertices for each of the 16 possible messages), making the most effective use of the spheres, or space.
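To make the (7,4) code concrete, here is a short Python sketch. The essay does not fix a particular generator matrix, so the systematic layout below is one standard illustrative choice, not the only one:

```python
import numpy as np

# One standard systematic generator for the (7,4) Hamming code:
# 4 message bits followed by 3 parity bits.
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
# Matching parity-check matrix: H @ codeword = 0 (mod 2) for valid codewords.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg):                 # 4 bits in, 7-bit codeword out
    return (np.array(msg) @ G) % 2

def correct(received):           # repair at most one flipped bit
    syndrome = (H @ received) % 2
    if syndrome.any():           # a single-bit error makes the syndrome equal
        i = next(j for j in range(7)       # exactly one column of H...
                 if (H[:, j] == syndrome).all())
        received = received.copy()
        received[i] ^= 1                   # ...which points at the bad bit
    return received

word = encode([1, 0, 1, 1])
noisy = word.copy()
noisy[2] ^= 1                    # flip one bit in transit
print((correct(noisy) == word).all())   # True: the error is found and fixed

# The counting argument from the text: 16 messages x 8 vertices = 128 = 2^7,
# so every vertex of the 7-dimensional hypercube is used -- a perfect code.
print(2**4 * (7 + 1) == 2**7)    # True
```

Each nonzero syndrome is a distinct column of H, which is exactly why the decoder can point to the single flipped bit.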
Perfect Hamming codes are a method of efficiently correcting single-bit errors, but it is important to note that they cannot correct every pattern of bits that may be corrupted in a data transmission error. There are also many other famous codes that I have not delved into, including Reed-Muller codes, the famous Golay (23,12) code that can correct up to three errors, and the Leech lattice in 24-dimensional space. There are also many links between error correction and probability that have not been mentioned here. With an increasing focus on quantum computing and how powerful this field can be, especially when we think about its impact on cryptography, it is interesting to think about qubits, which also transmit information. Qubits can be in any superposition of the states 0 and 1, and will also suffer errors, but recent research has shown that space-time could be involved in building error correcting codes for qubits and quantum computers. Perhaps this will be the area that coding theory focuses on next.

Thompson, Thomas M. (2014) – From Error-Correcting Codes Through Sphere Packings to Simple Groups (Chapter 1). doi: 10.5948/UPO9781614440215.002
Topological insulators (TIs) are one of the most puzzling quantum materials – a class of materials whose electrons cooperate in surprising ways to produce unexpected properties. The edges of a TI are electron superhighways where electrons flow with no loss, ignoring any impurities or other obstacles in their path, while the bulk of the material blocks electron flow. Scientists have studied these puzzling materials since their discovery just over a decade ago with an eye to harnessing them for things like quantum computing and information processing.

Now researchers at the Department of Energy's SLAC National Accelerator Laboratory and Stanford University have invented a new, hands-off way to probe the fastest and most ephemeral phenomena within a TI and clearly distinguish what its electrons are doing on the superhighway edges from what they're doing everywhere else. The technique takes advantage of a phenomenon called high harmonic generation, or HHG, which shifts laser light to higher energies and higher frequencies – much like pressing a guitar string produces a higher note – by shining it through a material. By varying the polarization of laser light going into a TI and analyzing the shifted light coming out, researchers got strong and separate signals that told them what was happening in each of the material's two contrasting domains.

"What we found out is that the light coming out gives us information about the properties of the superhighway surfaces," said Shambhu Ghimire, a principal investigator with the Stanford PULSE Institute at SLAC, where the work was carried out. "This signal is quite remarkable, and its dependence on the polarization of the laser light is dramatically different from what we see in conventional materials. We think we have a potentially novel approach for initiating and probing quantum behaviors that are supposed to be present in a broad range of quantum materials." The research team reported the results today in Physical Review A.

Light in, light out

Starting in 2010, a series of experiments led by Ghimire and PULSE Director David Reis showed HHG can be produced in ways that were previously thought unlikely or even impossible: by beaming laser light into a crystal, a frozen argon gas, or an atomically thin semiconductor material. Another study described how to use HHG to generate attosecond laser pulses, which can be used to observe and control the movements of electrons, by shining a laser through ordinary glass.

In 2018, Denitsa Baykusheva, a Swiss National Science Foundation Fellow with a background in HHG research, joined the PULSE group as a postdoctoral researcher. Her goal was to study the potential for generating HHG in topological insulators – the first such study in a quantum material. "We wanted to see what happens to the intense laser pulse used to generate HHG," she said. "No one had actually focused such a strong laser light on these materials before."

But midway through those experiments, the COVID-19 pandemic hit and the lab shut down in March 2020 for all but essential research. So the team had to think of other ways to make progress, Baykusheva said. "In a new area of research like this one, theory and experiment have to go hand in hand," she explained. "Theory is essential for explaining experimental results and also predicting the most promising avenues for future experiments. So we all turned ourselves into theorists" – first working with pen and paper and then writing code and doing calculations to feed into computer models.
An illuminating result

To their surprise, the results predicted that circularly polarized laser light, whose waves spiral around the beam like a corkscrew, could be used to trigger HHG in topological insulators. "One of the interesting things we observed is that circularly polarized laser light is very efficient at generating harmonics from the superhighway surfaces of the topological insulator, but not from the rest of it," Baykusheva said. "This is something very unique and specific to this type of material. It can be used to get information about electrons that travel the superhighways and those that don't, and it can also be used to explore other types of materials that can't be probed with linearly polarized light."

The results lay out a recipe for continuing to explore HHG in quantum materials, said Reis, who is a co-author of the study. "It's remarkable that a technique that generates strong and potentially disruptive fields, which takes electrons in the material and jostles them around and uses them to probe the properties of the material itself, can give you such a clear and robust signal about the material's topological states," he said. "The fact that we can see anything at all is amazing, not to mention the fact that we could potentially use that same light to change the material's topological properties." Experiments at SLAC have resumed on a limited basis, Reis added, and the results of the theoretical work have given the team new confidence that they know exactly what they are looking for.

Researchers from the Max Planck POSTECH/KOREA Research Initiative also contributed to this report. Major funding for the study came from the DOE Office of Science and the Swiss National Science Foundation.

Citation: Denitsa Baykusheva et al., Physical Review A, 2 February 2021 (10.1103/PhysRevA.103.023101)

For questions or comments, contact the SLAC Office of Communications at firstname.lastname@example.org.

SLAC is a vibrant multiprogram laboratory that explores how the universe works at the biggest, smallest and fastest scales and invents powerful tools used by scientists around the globe. With research spanning particle physics, astrophysics and cosmology, materials, chemistry, bio- and energy sciences and scientific computing, we help solve real-world problems and advance the interests of the nation. SLAC is operated by Stanford University for the U.S. Department of Energy's Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time.
Ultra-thin designer materials unlock quantum phenomena

A team of theoretical and experimental physicists have designed a new ultra-thin material that they have used to create elusive quantum states. Called one-dimensional Majorana zero energy modes, these quantum states could have a huge impact for quantum computing.

At the core of a quantum computer is a qubit, which is used to make high-speed calculations. The qubits currently used by Google (for example, in the Sycamore processor it unveiled last year) and others are very sensitive to noise and interference from the computer's surroundings, which introduce errors into the calculations. A new type of qubit, called a topological qubit, could solve this issue, and 1D Majorana zero energy modes may be the key to making them.

'A topological quantum computer is based on topological qubits, which are supposed to be much more noise tolerant than other qubits. However, topological qubits have not been produced in the lab yet,' explains Professor Peter Liljeroth, the lead researcher on the project.

What are MZMs?

MZMs are groups of electrons bound together in a specific way so they behave like a particle called a Majorana fermion, a semi-mythical particle first proposed by the physicist Ettore Majorana in the 1930s. If Majorana's theoretical particles could be bound together, they would work as a topological qubit. One catch: no evidence for their existence has ever been seen, either in the lab or in astronomy. Instead of attempting to make a particle that no one has ever seen anywhere in the universe, researchers try to make regular electrons behave like them.

To make MZMs, researchers need incredibly small materials, an area in which Professor Liljeroth's group at Aalto University specialises. MZMs are formed by giving a group of electrons a very specific amount of energy, and then trapping them together so they can't escape. To achieve this, the materials need to be 2-dimensional, and as thin as physically possible. To create 1D MZMs, the team needed to make an entirely new type of 2D material: a topological superconductor.

Topological superconductivity is the property that occurs at the boundary of a magnetic electrical insulator and a superconductor. To create 1D MZMs, Professor Liljeroth's team needed to be able to trap electrons together in a topological superconductor; however, it's not as simple as sticking any magnet onto any superconductor.

'If you put most magnets on top of a superconductor, you stop it from being a superconductor,' explains Dr. Shawulienu Kezilebieke, the first author of the study. 'The interactions between the materials disrupt their properties, but to make MZMs, you need the materials to interact just a little bit. The trick is to use 2D materials: they interact with each other just enough to make the properties you need for MZMs, but not so much that they disrupt each other.'

The property in question is the spin. In a magnetic material, the spins are all aligned in the same direction, whereas in a superconductor they are anti-aligned, with alternating directions. Bringing a magnet and a superconductor together usually destroys the alignment and anti-alignment of the spins. However, in 2D layered materials the interactions between the materials are just enough to "tilt" the spins of the atoms enough to create the specific spin configuration, known as Rashba spin-orbit coupling, needed to make the MZMs.
Finding the MZMs

The topological superconductor in this study is made of a layer of chromium bromide, a material which is still magnetic when only one atom thick. Professor Liljeroth's team grew one-atom-thick islands of chromium bromide on top of a superconducting crystal of niobium diselenide, and measured their electrical properties using a scanning tunneling microscope.

At this point, they turned to the computer modelling expertise of Professor Adam Foster at Aalto University and Professor Teemu Ojanen, now at Tampere University, to understand what they had made. 'There was a lot of simulation work needed to prove that the signal we're seeing was caused by MZMs, and not other effects,' says Professor Foster. 'We needed to show that all the pieces fitted together to prove that we had produced MZMs.'

Now that the team is sure they can make 1D MZMs in 2-dimensional materials, the next step will be to attempt to make them into topological qubits. This step has so far eluded teams who have already made 0-dimensional MZMs, and the Aalto team is unwilling to speculate on whether the process will be any easier with 1-dimensional MZMs; however, they are optimistic about the future of 1D MZMs.

'The cool part of this paper is that we've made MZMs in 2D materials,' said Professor Liljeroth. 'In principle these are easier to make and easier to customise the properties of, and ultimately make into a usable device.'

The research collaboration included researchers from Tampere University in Finland, and Maria Curie-Skłodowska University in Poland.
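The zero-mode physics the team was hunting can be illustrated with a textbook toy model, the Kitaev chain, which is unrelated to the chromium bromide system above but shows the same phenomenon in a few lines. A minimal sketch, assuming numpy, with parameters chosen to sit in the topological phase:

```python
import numpy as np

# Kitaev chain in Bogoliubov-de Gennes form, H = [[h, D], [-D, -h]].
N, t, delta, mu = 40, 1.0, 1.0, 0.5   # topological phase for |mu| < 2t

h = -mu * np.eye(N)                   # onsite chemical potential
D = np.zeros((N, N))                  # p-wave pairing (antisymmetric)
for i in range(N - 1):
    h[i, i + 1] = h[i + 1, i] = -t    # nearest-neighbour hopping
    D[i, i + 1], D[i + 1, i] = delta, -delta

H = np.block([[h, D], [-D, -h]])
energies = np.sort(np.abs(np.linalg.eigvalsh(H)))
print(energies[:4])   # first two values are exponentially close to zero
```

The two near-zero energies correspond to the pair of Majorana modes pinned to the chain's two ends, the 1D analogue of the states described in the article.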
Please read this guest post about the quantum Internet by Stephanie Wehner, Professor at the Delft University of Technology in the Netherlands.

In March 2017, we invited Stephanie Wehner, Professor at QuTech at the Delft University of Technology, to give a guest lecture to RIPE NCC staff about the Quantum Internet project. We were curious to learn about this new technology, its consequences for the "traditional" Internet, and how we can make the connection between cutting-edge research and the RIPE community.

The Technical Basics of Quantum Computing

The goal of the quantum Internet is to enable transmission of quantum bits (qubits) between any two points on earth in order to solve problems that are intractable classically. Qubits are very different from classical bits in that they can be "0" and "1" at the same time, and cannot be copied. Currently, it is possible to make a transmission over 100 km, and run a single application known as quantum key distribution. The next challenge is to go long distance, and to connect small quantum processors to enable a larger range of applications. Thankfully, these quantum processors do not need to be large quantum computers: a handful of qubits are already enough to outperform classical communication.

The reason why quantum Internet nodes do not need many qubits to be useful (unlike quantum computers) is that a quantum Internet derives its advantages from quantum entanglement, for which even a single qubit can be enough. In contrast, a quantum computer always needs more qubits than can be simulated on a classical supercomputer to be useful.

Use cases for quantum networking currently include:

- Secure communication with the help of quantum key distribution
- Clock synchronisation
- Combining distant telescopes to form one much more powerful telescope
- Advantages for classic problems in distributed systems, such as achieving consensus and agreement about data distributed in the cloud
- Sending exponentially fewer qubits than classical bits to solve some distributed computing problems
- Secure access to a powerful quantum computer using only very simple "desktop" quantum devices
- Combining small quantum computers to form a larger quantum computing cluster

In general, quantum networking exploits two essential features of quantum entanglement. First, quantum entanglement is inherently private – if two network nodes are maximally entangled, then this entanglement is completely shielded from anything else in the universe, according to the laws of quantum mechanics. Second, quantum entanglement allows maximal coordination – measuring two qubits that are entangled always results in the same outcome, no matter how far apart they are. It is this feature of perfect coordination that gives advantages in, for example, clock synchronisation, or even winning online bridge more often using quantum entanglement.

Dutch Test-bed Network

QuTech at the Delft University of Technology and TNO, in collaboration with the European Quantum Internet Alliance, is leading the effort to establish a quantum Internet, and aims to have a demonstration network in 2020 connecting four cities in the Netherlands. This network may be the first of its kind in 2020, and will allow the end-to-end transmission of qubits between any two network nodes consisting of few-qubit processors.

The quantum network in The Netherlands

Transmitting Qubits Over Long Distances

One may wonder why it is difficult to send qubits over long distances.
Roughly speaking, one qubit corresponds to just one photon, which is easily lost over distance. The technology needed to transmit qubits over long distances is called a quantum repeater. A quantum repeater works very differently from a classical repeater, exploiting the fact that qubits can be transmitted using quantum teleportation. Quantum teleportation works by first creating two entangled qubits between two network nodes. Once the entangled link is created, the qubit to be transmitted can be sent over it.

Imagine two network nodes that are 200 km apart – too far for direct transmission. A quantum repeater in the middle works as follows: first, two entangled qubits are created between the first endpoint and the repeater. This is possible since this endpoint and the repeater are only 100 km apart. Second, two entangled qubits are created between the repeater and the second endpoint. The repeater then uses quantum teleportation to transfer the qubit that is entangled with the first endpoint to the second endpoint. The end result is end-to-end entanglement between the two endpoints. Qubit data can now be transmitted using this entangled link.

The concept of a quantum repeater

Involvement with the RIPE Community

After this research project is accomplished, industry partners from the RIPE community are needed to take over in order to scale, increase the speed, and add this new technology to the "traditional" Internet as a parallel service. A quantum Internet also needs significant protocol development to define a networking stack adapted to the transmission of qubits and the management of entanglement. This requires the help of the RIPE community at large to develop a classical protocol stack to control a quantum Internet and implement protocols to route qubits.

Join us at the Open Day at QuTech

On 22 June 2017, QuTech is organising a presentation and a tour at the lab at QuTech in Delft, the Netherlands. Here is the programme for the day:

10:00 Presentation - Stephanie
11:00 Start lab tours
12:00 Light lunch & meet & greet - Stephanie

Please note that participation is limited to 25 people. Please register HERE if you are interested in participating.

If you are interested to learn more, please join us at one of the events listed above, or get in touch with Stephanie and her team. You can also leave a comment below.
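The teleportation step at the heart of a repeater is compact enough to simulate directly. Below is a minimal state-vector sketch, assuming numpy; the input state and random seed are arbitrary. It runs the standard protocol: entangle, Bell-measure Alice's two qubits, then apply the classically signalled corrections on Bob's side.

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.diag([1.0, -1.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def kron(*ops):
    out = np.eye(1)
    for op in ops:
        out = np.kron(out, op)
    return out

# CNOT with qubit 0 (Alice's data qubit) as control, qubit 1 as target
P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])
CNOT01 = kron(P0, I2, I2) + kron(P1, X, I2)

psi = np.array([0.6, 0.8j])                 # the state to teleport
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)  # pair shared by Alice and Bob
state = np.kron(psi, bell)                  # full 3-qubit register

state = kron(H, I2, I2) @ (CNOT01 @ state)  # Alice's Bell-basis rotation

# Alice projectively measures her two qubits
rng = np.random.default_rng(7)
outcomes = [(a, b) for a in (0, 1) for b in (0, 1)]
projs = [kron(np.diag([1 - a, a]), np.diag([1 - b, b]), I2) for a, b in outcomes]
probs = [np.linalg.norm(P @ state) ** 2 for P in projs]
k = rng.choice(4, p=probs)
m0, m1 = outcomes[k]
state = projs[k] @ state / np.sqrt(probs[k])

# Bob extracts his qubit and applies the classically signalled fix-up
bob = state.reshape(2, 2, 2)[m0, m1, :]
bob = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1) @ bob
print(np.allclose(bob, psi))                # True: the state arrived intact
```

Note that the two measurement bits must still be sent over a classical channel, which is why teleportation does not allow faster-than-light signalling.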
Wormhole

A wormhole, also known as an Einstein–Rosen bridge, is a hypothetical topological feature of spacetime that would fundamentally be a "shortcut" through spacetime. A wormhole is much like a tunnel with two ends, each at separate points in spacetime. For a simplified notion of a wormhole, visualize space as a two-dimensional (2D) surface.

Learning and training: statistics and myths

How effective is training? Laurie Bassi measured how well employees are trained and developed (Delahoussaye, et al., 2002). She writes that organizations that make large investments in people typically have lower employee turnover, which is associated with higher customer satisfaction, which in turn is a driver of profitability (p. 22). A second driver is manager proficiency – good managers determine whether people stay or go, and this is also influenced by training and development. She further writes that the education and training variable is the most significant predictor of an organization's success, as compared to price-to-earnings ratios, price-to-book statistics, and measures of risk and volatility. Bassi puts her theories to the test – she and a fellow partner launched an investment firm that buys stocks in companies that invest heavily in employee training.

New Wormhole Theory Uses Space Photon Energy "Fluid"

A new theory expands on other theories and adds photon energy "fluid" as a way to support wormholes. The introduction to the paper states the following: wormholes are hypothetical geometrical structures connecting two universes or two distant parts of the same universe. For a simple visual explanation of a wormhole, consider spacetime visualized as a two-dimensional (2D) surface. If this surface is folded along a third dimension, it allows one to picture a wormhole "bridge". "A possible cause of the late-time cosmic acceleration is an exotic fluid with an equation of state lying within the phantom regime, i.e., w = p/ρ < −1."

New data confirms: Neutrinos are still traveling faster than light

"It is worth pointing out, however, that the latest arXiv preprint lists 179 authors, while the original lists 174. Would you ever classify five people as 'most of' 15? To make things more confusing, 'four new people' have decided not to sign, according to Science. Now, none of the above numbers may match up; the original 174 include a duplicate …"

World Economic Forum: 8 digital skills we must teach our children

The social and economic impact of technology is widespread and accelerating. The speed and volume of information have increased exponentially. Experts are predicting that 90% of the entire population will be connected to the internet within 10 years. With the internet of things, the digital and physical worlds will soon be merged. These changes herald exciting possibilities. But they also create uncertainty.

Gravitational-wave finding causes 'spring cleaning' in physics

Artist's rendering of 'bubble universes' within a greater multiverse, an idea that some experts say was bolstered by this week's discovery of gravitational waves (image: Detlev van Ravenswaay/Science Photo Library). On 17 March, astronomer John Kovac of the Harvard-Smithsonian Center for Astrophysics presented long-awaited evidence of gravitational waves – ripples in the fabric of space – that originated from the Big Bang during a period of dramatic expansion known as inflation. By the time the Sun set that day in Cambridge, Massachusetts, the first paper detailing some of the discovery's consequences had already been posted online, by cosmologist David Marsh of the Perimeter Institute for Theoretical Physics in Waterloo, Canada, and his colleagues. Cosmologist Marc Kamionkowski of Johns Hopkins University in Baltimore, Maryland, agrees that some axion models no longer work, "because they require inflation to operate at a lower energy scale than the one indicated by BICEP2".

Quantum world record smashed (University of Oxford, 14 November 2013)

A normally fragile quantum state has been shown to survive at room temperature for a world record 39 minutes, overcoming a key barrier towards building ultrafast quantum computers. An international team including Stephanie Simmons of Oxford University, UK, report in this week's Science a test performed by Mike Thewalt of Simon Fraser University, Canada, and colleagues.

New Experiments to Pit Quantum Mechanics Against General Relativity

It starts like a textbook physics experiment, with a ball attached to a spring. If a photon strikes the ball, the impact sets it oscillating very gently. But there's a catch. Before reaching the ball, the photon encounters a half-silvered mirror, which reflects half of the light that strikes it and allows the other half to pass through. What happens next depends on which of two extremely well-tested but conflicting theories is correct: quantum mechanics or Einstein's theory of general relativity; these describe the small- and large-scale properties of the universe, respectively. In a strange quantum mechanical effect called "superposition," the photon simultaneously passes through and reflects backward off the mirror; it then both strikes and doesn't strike the ball.

Most students don't know when news is fake

Preteens and teens may appear dazzlingly fluent, flitting among social-media sites, uploading selfies and texting friends. But they're often clueless about evaluating the accuracy and trustworthiness of what they find. Some 82% of middle-schoolers couldn't distinguish between an ad labeled "sponsored content" and a real news story on a website, according to a Stanford University study of 7,804 students from middle school through college. The study, set for release Tuesday, is the biggest so far on how teens evaluate information they find online. Many students judged the credibility of newsy tweets based on how much detail they contained or whether a large photo was attached, rather than on the source. More than two out of three middle-schoolers couldn't see any valid reason to mistrust a post written by a bank executive arguing that young adults need more financial-planning help.

Carver Mead's Spectator Interview

From American Spectator, Sep/Oct 2001, Vol. 34, Issue 7, p. 68. Once upon a time, Nobel Laureate Richard Feynman, leader of the last great generation of physicists, threw down the gauntlet to anyone rash enough to doubt the fundamental weirdness, the quark-boson-muon-strewn amusement-park landscape of late 20th-century quantum physics: "Things on a very small scale behave like nothing you have direct experience about."

Visual learning

Visual thinking is a learning style where the learner better understands and retains information when ideas, words and concepts are associated with images. Research tells us that the majority of students in a regular classroom need to see information in order to learn it. Some common visual learning strategies include creating graphic organizers, diagramming, mind mapping, outlining and more.

New qubit control bodes well for future of quantum computing

(Phys.org) Yale University scientists have found a way to observe quantum information while preserving its integrity, an achievement that offers researchers greater control in the volatile realm of quantum mechanics and greatly improves the prospects of quantum computing. Quantum computers would be exponentially faster than the most powerful computers of today. "Our experiment is a dress rehearsal for a type of process essential for quantum computing," said Michel Devoret, the Frederick William Beinecke Professor of Applied Physics & Physics at Yale and principal investigator of research published Jan. 11 in the journal Science. "What this experiment really allows is an active understanding of quantum mechanics."

Education

Home of everything Gamification Education – research, community, case studies and more – as part of the Gamification.org family of wikis. Education affects everyone.
By Glenn Roberts Jr.

A team led by physicists at Lawrence Berkeley National Laboratory (Berkeley Lab) and UC Berkeley has successfully observed the scrambling of quantum information, which is thought to underlie the behavior of black holes, using qutrits: information-storing quantum units that can represent three separate states at the same time. Their efforts also pave the way for building a quantum information processor based upon qutrits.

The black hole information paradox

The new study, recently published in the journal Physical Review X, makes use of a quantum circuit that is inspired by the longstanding physics question: what happens to information when it enters a black hole?

Beyond the connection to cosmology and fundamental physics, the team's technical milestones that made the experiment possible represent important progress toward using more complex quantum processors for quantum computing, cryptography, and error detection, among other applications.

While black holes are considered one of the most destructive forces in the universe – matter and light cannot escape their pull, and are quickly and thoroughly scrambled once they enter – there has been considerable debate about whether and how information is lost after passing into a black hole.

The late physicist Stephen Hawking showed that black holes emit radiation – now known as Hawking radiation – as they slowly evaporate over time. In principle, this radiation could carry information about what's inside the black hole – even allowing the reconstruction of information that passes into the black hole. And by using a quantum property known as entanglement, it is possible to perform this reconstruction significantly more rapidly, as was shown in earlier work.

Quantum entanglement defies the rules of classical physics, allowing particles to remain correlated even when separated by large distances, so that the state of one particle will inform you about the state of its entangled partner. If you had two entangled coins, for example, knowing that one coin came up heads when you looked at it would automatically tell you that the other entangled coin was tails.

Most efforts in quantum computing seek to tap into this phenomenon by encoding information as entangled quantum bits, known as qubits (pronounced CUE-bits). Like a traditional computer bit, which can hold the value of zero or one, a qubit can also be either a zero or one. But in addition, a qubit can exist in a superposition that is both one and zero at the same time. In the case of a coin, it's like a coin flip that can represent either heads or tails, as well as the superposition of both heads and tails at the same time.

The power of 3: Introducing qutrits

Each qubit you add to a quantum computer doubles the size of the state space it can represent, and that exponential increase soars when you use quantum bits capable of storing more values, like qutrits (pronounced CUE-trits). Because of this, it takes far fewer qubits, and fewer still qutrits or qudits (the general term for quantum units with three or more states), to perform complex algorithms capable of demonstrating the ability to solve problems that cannot be solved using conventional computers.

That said, there are a number of technical hurdles to building quantum computers with a large number of quantum bits that can operate reliably and efficiently in solving problems in a truly quantum way.
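The payoff from higher-dimensional units is easy to quantify: n qubits span a state space of 2^n amplitudes, while n qutrits span 3^n, and the gap widens rapidly. A few lines of illustrative arithmetic:

```python
# Hilbert-space dimension: n qubits give 2**n amplitudes, n qutrits 3**n.
for n in (1, 10, 20, 30):
    print(f"{n:2d} units: qubits -> {2**n:,}   qutrits -> {3**n:,}"
          f"   (ratio ~{3**n / 2**n:,.0f}x)")
```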
In this latest study, researchers detail how they developed a quantum processor capable of encoding and transmitting information using a series of five qutrits, which can each simultaneously represent three states. And despite the typically noisy, imperfect, and error-prone environment of quantum circuitry, they found that their platform proved surprisingly resilient and robust.

Qutrits can have a value of zero, one, or two, holding all of these states in superposition. In the coin analogy, it's like a coin that has the possibility of coming up as heads, tails, or landing on its thin edge.

"A black hole is an extremely good encoder of information," said Norman Yao, a faculty scientist in Berkeley Lab's Materials Sciences Division and an assistant professor of physics at UC Berkeley who helped to lead the planning and design of the experiment. "It smears it out very quickly, so that any local noise has an extremely hard time destroying this information." But, he added, "The encoder is so darn good that it's also very hard to decode this information."

Creating an experiment to mimic quantum scrambling

The team set out to replicate the type of rapid quantum information smearing, or scrambling, in an experiment that used tiny devices called nonlinear harmonic oscillators as qutrits. These nonlinear harmonic oscillators are essentially sub-micron-sized weights on springs that can be driven at several distinct frequencies when subjected to microwave pulses.

A common problem in making these oscillators work as qutrits, though, is that their quantum nature tends to break down very quickly via a mechanism called decoherence, so it is difficult to distinguish whether the information scrambling is truly quantum or is due to this decoherence or other interference, noted Irfan Siddiqi, the study's lead author.

Siddiqi is director of Berkeley Lab's Advanced Quantum Testbed, a faculty scientist in the Lab's Computational Research and Materials Sciences divisions, and a professor of physics at UC Berkeley. The testbed, which began accepting proposals from the quantum science community in 2020, is a collaborative research laboratory that provides open, free access to users who want to explore how superconducting quantum processors can be used to advance scientific research. The demonstration of scrambling is one of the first results from the testbed's user program.

"In principle, an isolated black hole exhibits scrambling," Siddiqi said, "but any experimental system also exhibits loss from decoherence. In a laboratory, how do you distinguish between the two?"

A key to the study was in preserving the coherence, or orderly patterning, of the signal carried by the oscillators for long enough to confirm that quantum scrambling was occurring via the teleportation of a qutrit. While teleportation may conjure up sci-fi imagery of "beaming up" people or objects from a planet's surface onto a spaceship, in this case there is only the transmission of information – not matter – from one location to another via quantum entanglement.

Another essential piece was the creation of customized logic gates that enable the realization of "universal quantum circuits," which can be used to run arbitrary algorithms. These logic gates allow pairs of qutrits to interact with each other and were designed to handle three different levels of signals produced by the microwave pulses. One of the five qutrits in the experiment served as the input, and the other four qutrits were in entangled pairs.
Because of the nature of the qutrits' entanglement, a joint measurement of one of the pairs of qutrits after the scrambling circuit ensured that the state of the input qutrit was teleported to another qutrit.

Mirrored black holes and wormholes

The researchers used a technique known as quantum process tomography to verify that the logic gates were working and that the information was properly scrambled, so that it was equally likely to appear in any given part of the quantum circuit.

Siddiqi said that one way to think about how the entangled qutrits transmit information is to compare it to a black hole. It's as if there is a black hole and a mirrored version of that black hole, so that information passing in one side of the mirrored black hole is transmitted to the other side via entanglement.

Looking forward, Siddiqi and Yao are particularly interested in tapping into the power of qutrits for studies related to traversable wormholes, which are theoretical passages connecting separate locations in the universe, for example.

A scientist from the Perimeter Institute for Theoretical Physics in Canada also participated in the study, which received support from the U.S. Department of Energy's Office of Advanced Scientific Computing Research and Office of High Energy Physics, and from the National Science Foundation's Graduate Research Fellowship.

Founded in 1931 on the belief that the biggest scientific challenges are best addressed by teams, Lawrence Berkeley National Laboratory and its scientists have been recognized with 14 Nobel Prizes. Today, Berkeley Lab researchers develop sustainable energy and environmental solutions, create useful new materials, advance the frontiers of computing, and probe the mysteries of life, matter, and the universe. Scientists from around the world rely on the Lab's facilities for their own discovery science. Berkeley Lab is a multiprogram national laboratory, managed by the University of California for the U.S. Department of Energy's Office of Science.

DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.
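To make the qutrit idea concrete, here is a small numerical sketch (assuming numpy; purely illustrative, not the experiment's control code). The d = 3 generalizations of the familiar qubit gates, a shift operator, a clock operator and a Fourier gate, are enough to build a maximally entangled pair of qutrits like those used in the teleportation step.

```python
import numpy as np

d = 3
omega = np.exp(2j * np.pi / d)
X = np.roll(np.eye(d), 1, axis=0)            # shift: |j> -> |j+1 mod 3>
Z = np.diag(omega ** np.arange(d))           # clock: |j> -> omega**j |j>
F = omega ** np.outer(np.arange(d), np.arange(d)) / np.sqrt(d)  # qutrit Fourier

# Controlled-sum gate, the qutrit analogue of CNOT: |a, b> -> |a, a+b mod 3>
CSUM = sum(np.kron(np.diag(np.eye(d)[a]), np.linalg.matrix_power(X, a))
           for a in range(d))

ket0 = np.eye(d)[0]
state = CSUM @ np.kron(F @ ket0, ket0)       # (|00> + |11> + |22>) / sqrt(3)
print(np.round(state, 3))
```

The printed vector has equal weight on |00>, |11> and |22>, the three-level analogue of a Bell pair.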
A Chinese satellite has split pairs of "entangled photons" and transmitted them to separate ground stations 745 miles (1,200 kilometers) apart, smashing the previous distance record for such a feat and opening new possibilities in quantum communication.

In quantum physics, when particles interact with each other in certain ways they become "entangled." This essentially means they remain connected even when separated by large distances, so that an action performed on one affects the other. In a new study published online today (June 15) in the journal Science, researchers report the successful distribution of entangled photon pairs to two locations on Earth separated by 747.5 miles (1,203 km).

Quantum entanglement has interesting applications for testing the fundamental laws of physics, but also for creating exceptionally secure communication systems, scientists have said. That's because quantum mechanics states that measuring a quantum system inevitably disturbs it, so any attempt to eavesdrop is impossible to hide.

But it's hard to distribute entangled particles – normally photons – over large distances. When traveling through air or over fiber-optic cables, the environment interferes with the particles, so with greater distances the signal decays and becomes too weak to be useful.

In 2003, Pan Jianwei, a professor of quantum physics at the University of Science and Technology of China, started work on a satellite-based system designed to beam entangled photon pairs down to ground stations. The idea was that because most of the particles' journey would be through the vacuum of space, this system would introduce considerably less environmental interference.

"Many people then thought it [was] a crazy idea, because it was very challenging already doing the sophisticated quantum-optics experiments inside a well-shielded optical table," Pan told Live Science. "So how can you do similar experiments at thousand-kilometers distance scale and with the optical elements vibrating and moving at a speed of 8 kilometers per second [5 miles per second]?"

In the new study, researchers used China's Micius satellite, which was launched last year, to transmit the entangled photon pairs. The satellite features an ultrabright entangled photon source and a high-precision acquiring, pointing and tracking (APT) system that uses beacon lasers on the satellite and at three ground stations to line up the transmitter and receivers.

Once the photons reached the ground stations, the scientists carried out tests and confirmed that the particles were still entangled despite having traveled between 994 miles and 1,490 miles (1,600 and 2,400 km), depending on what stage of its orbit the satellite was positioned at. Only the lowest 6 miles (10 km) of Earth's atmosphere are thick enough to cause significant interference with the photons, the scientists said.

This means the overall efficiency of their link was vastly higher than previous methods for distributing entangled photons via fiber-optic cables, according to the scientists. "We have already achieved a two-photon entanglement distribution efficiency a trillion times more efficient than using the best telecommunication fibers," Pan said. "We have done something that was absolutely impossible without the satellite."
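What "remaining connected" means statistically can be checked with a Bell (CHSH) test. The sketch below uses the textbook analyzer angles, not those of the Micius experiment, and evaluates the polarization correlations quantum mechanics predicts for an entangled photon pair; any local hidden-variable model caps the combination S at 2.

```python
import numpy as np

# Polarization correlation for a |Phi+> photon pair: E(a, b) = cos(2(a - b))
def E(a, b):
    return np.cos(2 * (a - b))

a, a2 = 0.0, np.pi / 4            # Alice's analyzer angles
b, b2 = np.pi / 8, 3 * np.pi / 8  # Bob's analyzer angles

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(S, 2 * np.sqrt(2))          # ~2.828 > 2: the classical bound is violated
```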
Apart from carrying out experiments, one of the potential uses for this kind of system is "quantum key distribution," in which quantum communication systems are used to share an encryption key between two parties that is impossible to intercept without alerting the users. When combined with the correct encryption algorithm, this system is uncrackable even if encrypted messages are sent over normal communication channels, experts have said.

Artur Ekert, a professor of quantum physics at the University of Oxford in the United Kingdom, was the first to describe how entangled photons could be used to transmit an encryption key. "The Chinese experiment is quite a remarkable technological achievement," Ekert told Live Science. "When I proposed the entangled-based quantum key distribution back in 1991 when I was a student in Oxford, I did not expect it to be elevated to such heights!"

The current satellite is not quite ready for use in practical quantum communication systems, though, according to Pan. For one, its relatively low orbit means each ground station has coverage for only about 5 minutes each day, and the wavelength of photons used means it can only operate at night, he said. Boosting coverage times and areas will mean launching new satellites with higher orbits, Pan said, but this will require bigger telescopes, more precise tracking and higher link efficiency. Daytime operation will require the use of photons in the telecommunications wavelengths, he added.

But while developing future quantum communication networks will require considerable work, Thomas Jennewein, an associate professor at the University of Waterloo's Institute for Quantum Computing in Canada, said Pan's group has demonstrated one of the key building blocks. "I have worked in this line of research since 2000 and researched similar implementations of quantum-entanglement experiments from space, and I can therefore very much attest to the boldness, dedication and skills that this Chinese group has shown," he told Live Science.

Original article on Live Science.
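The key-sifting logic behind quantum key distribution can be sketched with the prepare-and-measure BB84 protocol, a cousin of the entanglement-based scheme Ekert proposed. The toy simulation below assumes numpy, no channel noise and no eavesdropper; in a real run, a sample of the sifted key would be compared publicly, and a nonzero error rate would betray an interceptor.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 32
alice_bits  = rng.integers(0, 2, n)
alice_basis = rng.integers(0, 2, n)   # 0 = rectilinear, 1 = diagonal
bob_basis   = rng.integers(0, 2, n)

# Without an eavesdropper, Bob's result matches whenever the bases agree;
# mismatched bases yield a random bit that will be discarded.
bob_bits = np.where(alice_basis == bob_basis,
                    alice_bits, rng.integers(0, 2, n))

keep = alice_basis == bob_basis       # basis choices compared publicly
key = alice_bits[keep]
print("sifted key:", key)
print("errors:", int(np.sum(key != bob_bits[keep])))   # 0 with no Eve
```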
St Johns Field is a massive helium reservoir and immense carbon storage basin located on 152,000 acres in Apache County, Arizona. Extensive third-party geological studies performed on the property indicate reserves of up to 33 billion cubic feet of helium in shallow, easily accessible reservoirs. Capable of producing one billion cubic feet of helium per year, it will be among the most prolific helium production sites in the world.

While most helium is extracted from natural gas deposits, the helium produced at St Johns is highly unusual in that it does not contain any hydrocarbons. The gas deposit is composed almost entirely of carbon dioxide, and as the helium is extracted in the production process, all of the excess CO2 will be reinjected into isolated geological formations and safely sequestered deep underground for millennia. As a result, the helium produced at St Johns is exceptionally clean and environmentally friendly, with a net zero carbon footprint.

Helium is the only element on the planet that is a completely non-renewable resource. It is both scarce and finite, with no commercially viable industrial process to replicate it. Helium is formed by the natural radioactive decay of uranium, and can be trapped underground if a halite or anhydrite cap exists above it. If helium is not trapped in this way, it escapes to the atmosphere and rises into space.

Helium has the lowest boiling point of any element, about 4 kelvin, and has unique superfluid properties. It has many applications as a high-tech coolant, and is a critical component for nearly all modern technology systems. For example, liquid helium is used to cool the magnets in MRI systems, helping to optimize their function. It is also used to control the temperature of silicon in the semiconductor manufacturing process. Because helium is inert and non-flammable, it is used in space and satellite systems as a purge gas in hydrogen systems, and as a pressurizing agent for ground and flight fluid systems. Both NASA and SpaceX are major consumers of helium. Data centers use helium to encapsulate hard drives, which reduces friction and energy consumption; Google, Amazon, and Netflix are all major consumers. Quantum computing systems also use liquid helium in dilution refrigerators, providing temperatures as low as 2 mK.

In addition to its immense helium reserves, the geological characteristics of St Johns make it an ideal storage basin for carbon dioxide. With the ability to inject 22 million metric tons of CO2 per year and a total storage capacity of over 1 billion metric tons, St Johns is set to become one of the largest carbon capture sites in the world. Strategically located in the fast-growing American Southwest near several coal-fired power plants, Proton Green is well positioned to become a critical carbon sequestration hub in the region. The exceptionally well-suited geological storage structure, with its remote location, pipeline infrastructure, right of way, and Class VI storage permits (once granted), will present significant barriers to entry for competitors.

Hydrogen is steadily emerging as one of the most effective fossil fuel replacements and could become a lucrative opportunity for Proton Green as the global movement toward decarbonization and a net zero economy continues. Our processing plants are capable of producing large volumes of industrial-grade hydrogen while simultaneously sequestering the excess CO2 in underground storage basins, thereby qualifying as blue hydrogen.
The hydrogen we produce can then be sold into the California markets and will be eligible for Low Carbon Fuel Standard (LCFS) credits as we help drive the transition toward a sustainable fuel and energy source.

Proton Green will partner with government agencies, NGOs, research institutions, and startup companies to create a cutting-edge incubator and innovation center for emerging carbon-neutral technologies and processes like blue hydrogen, CO2-enhanced geothermal energy, biomass energy, and carbon fiber materials. The research center will be located in a designated Opportunity Zone in the extreme southwest corner of the property, and Proton Green will provide CO2 to support research and development activities. We are currently pursuing an opportunity to develop a bioenergy plant that will convert forest-wood waste into biofuel.

A seasoned independent oil and gas producer since 1982, Mr. Looper has extensive experience drilling and operating wells in Colorado, Kentucky, Louisiana, New Mexico, Oklahoma, Texas and Wyoming. He also has project management experience in Botswana, Canada, South Africa and Zimbabwe. Since 1993, Mr. Looper has been focused on the development of large resource plays in West Texas at Riata Energy, Inc. and most recently in the Barnett Shale trend, where his capital providers achieved >100% rates of return. Mr. Looper is an alumnus of West Texas State University, T. Boone Pickens School of Business, and participated in the Harvard Business School Executive Management Program from 2003 to 2007.

Mr. Coates is a highly experienced oil and gas professional with a career emphasis on large-scale, unconventional resource development. He is currently involved in helium development, carbon capture, oil and gas, and geothermal projects. His educational background in geology, geochemistry and engineering led to an initial career with Advanced Resources International, a domestic and international technical consulting firm at the forefront of unconventional resource development and carbon capture technology. He subsequently joined MCN Corp (now DTE Energy) in a senior management role to successfully develop a multi-TCF natural gas reserve base in the US. He also co-founded the E&P company Patrick Energy with the funding of a family office, which has led to a series of privately funded ($200MM capital) E&P companies built and sold over the past twenty years.
Encryption technologies are used to secure many applications and websites that you use daily. For example, online banking or shopping, email applications, and secure instant messaging use encryption. Encryption technologies secure information while it is in transit (e.g. connecting to a website) and while it is at rest (e.g. stored in encrypted databases). Many up-to-date operating systems, mobile devices, and cloud services offer built-in encryption, but what is encryption? How is it used? And what should you and your organization consider when using it?

What is encryption?

Figure 1 - Encryption encodes (or scrambles) information. (Long description: the image shows how encryption encodes information and protects its confidentiality by stopping unauthorized individuals from accessing it, as they do not have the key to decrypt the message.)

Encryption encodes (or scrambles) information. Encryption protects the confidentiality of information by preventing unauthorized individuals from accessing it.

For example, Alice wants to send Bob a message, and she wants to ensure only he can read it. To keep the information confidential and private, she encrypts the message using a secret key. Once encrypted, this message can only be read by someone who has the secret key to decode it. In this case, Bob has the secret key. Eve is intentionally trying to intercept the message and read it. However, the message is encrypted, and even if Eve gets a copy of it, she can't read it without acquiring the secret key. If an individual accidentally receives a message that includes encrypted information, they will be unable to read the encrypted contents without the key to decrypt the message.

How is encryption used?

Encryption is an important part of cyber security. It is used in a variety of ways to keep data confidential and private, such as in HTTPS websites, secure messaging applications, email services, and virtual private networks. Encryption is used to protect information while it is actively moving from one location to another (i.e. in transit) from sender to receiver. For example, when you connect to your bank's website using a laptop or a smartphone, the data that is transmitted between your device and the bank's website is encrypted.

Encryption is also used to protect information while it is at rest. For example, when information is stored in an encrypted database, it is stored in an unreadable format. Even if someone gains access to that database, there's an additional layer of security for the stored information.

Encryption is also used to protect personal information that you share with organizations. For example, when you share your personal information (e.g. birthdate, banking or credit card information) with an online retailer, you should make sure they are protecting your information with encryption by using secure browsing. Many cloud service providers offer encryption to protect your data while you are using cloud-based services. These services offer the ability to keep data encrypted when uploading or downloading files, as well as storing the encrypted data to keep it protected while at rest.

When properly implemented, encryption is a mechanism that you and your organization can use to keep data private. Encryption is seamlessly integrated into many applications to provide a secure user experience.

How can I use encryption?

Your organization likely already uses encryption for many applications, such as secure browsing and encrypted messaging applications.
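To make the Alice-and-Bob picture concrete, here is a minimal symmetric-encryption sketch using the Fernet interface from the widely used Python cryptography package. It is illustrative only, and it assumes the secret key has already been shared securely out of band.

```python
# pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # the secret key Alice shares with Bob
f = Fernet(key)

token = f.encrypt(b"meet at noon")   # what Eve sees: unreadable ciphertext
print(token)
print(f.decrypt(token))              # only a key holder recovers the plaintext
```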
If you access a website with a padlock icon and HTTPS in front of the web address, the communication with the website (i.e. the data exchanged between your device and the website's servers) is encrypted. To protect your organization's information and systems, we recommend that you use HTTPS wherever possible. To ensure that users are accessing only HTTPS-supported websites, your organization should implement the web security policy tool HTTP Strict Transport Security (HSTS). HSTS offers additional security by forcing users' browsers to load HTTPS-supported websites and ignore unsecured websites (e.g. HTTP).

Encrypted messaging applications

Most instant messaging applications offer a level of encryption to protect the confidentiality of your information. In some cases, messages are encrypted between your device and the cloud storage used by the messaging service provider. In other cases, the messages are encrypted from your device to the recipient's device (i.e. end-to-end encryption). When using end-to-end encryption services, not even the messaging service provider can read your encrypted messages.

In deciding which tools to use, you need to consider both the functionality of the service and the security and privacy requirements of your information and activities. For further information, refer to Protect how you connect. Encryption is just one of many security controls necessary to protect the confidentiality of data.

What else should I consider?

Encryption is integrated into many products that are commonly used by individuals and organizations to run daily operations. When choosing a product that uses encryption, we recommend that you choose a product that is certified through the Common Criteria (CC) and the Cryptographic Module Validation Program (CMVP). The CC and the CMVP list cryptographic modules that conform to Federal Information Processing Standards. Although the CC and the CMVP are used to vet products for federal government use, we recommend that everyone use these certified products.

The CCCS recommends

When choosing a suitable encryption product for your organization, consider the following:

- Evaluate the sensitivity of your information (e.g. personal and proprietary data) to determine where it may be at risk and implement encryption accordingly.
- Choose a vendor that uses standardized encryption algorithms (e.g. CC and CMVP supported modules).
- Review your IT lifecycle management plan and budget to include software and hardware updates for your encryption products.
- Update and patch your systems frequently.
- Prepare and plan for the quantum threat to cyber security. For more information, please see Addressing the quantum computing threat to cryptography (ITSE.00.017).

Encryption for highly sensitive data

Systems that contain highly sensitive information (e.g. financial, medical, and government institutions) require additional security considerations. Contact us for further guidance on cryptographic solutions for high-sensitivity systems and information: firstname.lastname@example.org.
Semiconductors are drivers of modern electronics, and they are the main enablers of our communications, computing, energy, transport and IoT systems, and many more. Almost every device we have around us has a semiconductor in it, so no one can overestimate their importance in the world of technology. Today we're trying to break down the notion of semiconductors and discover what's inside this vital element and what trends are driving its development today.

A semiconductor, as the name implies, is a material whose electrical behavior lies between that of conductors and insulators. Conductors are substances that easily transmit electricity, while insulators transmit electricity poorly.

The semiconductor industry uses silicon as its primary material. Pure silicon conducts electricity only poorly, and on its own it does not have the characteristics needed to make a useful transistor. To change this, manufacturers add impurities to the silicon crystal structure. Impurities are atoms that do not belong to the regular arrangement of the crystal lattice. By adding these impurities, manufacturers can control how easily electrons and holes move through the silicon.

Silicon is the basis for most modern electronic devices. Transistor technology was first developed using germanium, a semiconductor with properties similar to silicon's. Germanium is still used today, but silicon is much easier to work with. Because of this, silicon remains the dominant semiconductor material.

Semiconductors are classified based on whether they are intrinsic or extrinsic. Intrinsic means that there are no impurities present in the material. Extrinsic means that the material has been doped to make it conductive.

Intrinsic semiconductors have no additional doping elements added to them; their weak natural conductivity comes from the pure material itself. Intrinsic semiconducting materials are often referred to as bulk materials. Examples of intrinsic semiconductors are silicon (Si) and germanium (Ge).

Extrinsic semiconductors are those that require doping to make them conductive. An example of an extrinsic semiconductor is doped silicon, commonly used in transistors. Here, boron atoms added to the crystal structure create positively charged vacancies called acceptor states. These states act as electron traps, causing the semiconductor to become electrically conductive.

The IT industry cannot be separated from the development of the semiconductor industry. Examples of semiconductor devices are transistors, MOSFETs, ICs, and diodes. One of the semiconductor devices commonly used in digital (logic-based circuit) technology is the transistor. The invention of the transistor in 1947 helped second-generation computers become smaller, faster, more reliable, and more energy-efficient than their predecessors. This era of mass transistor deployment began with Shockley and led to the birth of Fairchild Semiconductor, which is considered a pioneer among IC and transistor manufacturers. In the early 1960s, successful second-generation computers began to emerge in business, universities, and government. These second-generation computers were fully transistorized. From here came the next generations of computers, built on LSI, VLSI, and ULSI hardware, all the way up to supercomputers.
The birth of computer networking technology as well as the Internet, which is also supported by semiconductor-based devices, brought IT technology into the modern state as we know it today.

Semiconductors have revolutionized electronic hardware, especially since the invention of the transistor. Semiconductors make hardware more compact and give it better computing-related capabilities. The effect is that electronic components are now easier to obtain at affordable prices in the marketplace. This makes it easy for new developers to conduct research and innovation. LANARS provides hardware development services for creating new products and businesses, as well as for improving existing ones.

The semiconductor, commonly known as the chipset, is the most important component. Despite their small size, semiconductor chips are the brains of an electronic system. In digital devices, semiconductors are needed to increase the speed of digital signal processing, including memory for data storage. As we are now in the Industry 4.0 era, the need for semiconductor chips continues to grow. The semiconductor industry is also considered the lifeblood that is essential in accelerating digital transformation. The development of computers, the telecommunications industry, and automotive equipment, especially electric vehicles (EVs), as well as digitalization in many sectors, requires the readiness of the semiconductor industry to prepare the required resources.

In the midst of increasing demand for semiconductors, the global COVID-19 pandemic in 2020 hit almost the entire industry with lockdown policies. This also had an impact on the supply of semiconductors, resulting in reduced supply, which in turn affected other industries. The affected industries range from computers, smart TVs, smartphones, tablets, game consoles, and various electronic gadgets through to the automotive industry. On the other hand, the COVID-19 pandemic also increased the need for computers and gadgets, in line with school-from-home and work-from-home policies.

This condition caused semiconductor prices to rise from 2020 to the present. As a result, in 2021 the major chipset players such as TSMC actually reaped profits from the global chipset shortage. According to a report from research firm TrendForce, the top 10 chipset manufacturers combined earned total revenue of US$127.4 billion in 2021, an increase of 48% compared to the previous year. For 2022, as reported by Deloitte, some observers expect semiconductor sales to grow by another 10%, and to exceed US$600 billion for the first time. Going forward, semiconductors will continue to be needed by various industries; although economic uncertainty is predicted, chipset availability is expected to recover in 2023.

Moore's Law, the prediction that the number of transistors in an integrated circuit (IC) doubles roughly every two years, is used as a reference by the semiconductor industry to set its research and development targets. This is evidenced by microprocessor capabilities that increase every year. But even Moore's Law will eventually meet an impenetrable limit: increasing computer performance by adding transistors has so far been achieved by reducing the size of the transistor so that more of them fit in the same area.
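The doubling rule is easy to put into numbers. A small sketch, assuming a 1971 Intel 4004 baseline of about 2,300 transistors and a two-year doubling period (both are illustrative round figures):

```python
# Transistor count under a two-year doubling rule from a 1971 baseline
# (Intel 4004, ~2,300 transistors). Figures are indicative, not exact.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1985, 2000, 2022):
    print(year, f"~{transistors(year):,.0f}")
```

Running this lands within an order of magnitude of real chips in each era, which is exactly the kind of planning guide the industry has used Moore's Law for.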
A few years ago, physicist Michio Kaku noted that there is a point beyond which the silicon material used to make the transistor (or any substitute for it) cannot be reduced any further. Several studies have initiated the use of other materials for the development of semiconductors. Third-generation semiconductor materials, such as gallium nitride (GaN) and silicon carbide (SiC), promise high-temperature resistance, high breakdown voltage, high frequency, high power, and high radiation resistance. However, for a long time the use of these materials was limited to a narrow range of fields due to their complex processing methods and high cost. In recent years, breakthroughs in material growth and device fabrication have helped reduce the cost of third-generation semiconductor materials, enabling a wider range of applications. For example, SiC-based devices used for car inverters and GaN-based fast chargers have appeared on the market.

Semiconductor technology trends that have been widely discussed as ways to improve chip capabilities include parallel computing, quantum computing, and even protein computers that work with DNA.

A semiconductor is a material that has electrical properties between those of conductors and insulators. Semiconductors have brought drastic changes to the technological development of mankind: from Shockley and Fairchild making the first transistors, to large chipset manufacturers, to giants like Intel that use semiconductors to create technology that plays a very important role in the development of computers, gadgets, household appliances, automation, telecommunications, and so on.

The trajectory proclaimed by Moore's Law has largely played out, and it is predicted that the limit on transistor density in a wafer will eventually be reached as well. Therefore, various developments are being carried out to get more out of semiconductors, such as the use of third-generation materials and quantum computing. Semiconductors will continue to be needed by various industries; although economic uncertainty is predicted, chipset availability is also expected to recover in 2023.
- Advances in quantum computing could help us simulate large complex molecules.
- These simulations could uncover new catalysts for carbon capture that are cheaper and more efficient than current models.
- We can currently simulate small molecules on devices with up to a few dozen qubits, but we need to scale this to the order of 1 million.

Imagine being able to cheaply and easily "suck" carbon directly out of our atmosphere. Such a capability would be hugely powerful in the fight against climate change and would advance us towards the ambitious global climate goals that have been set. Surely that's science fiction? Well, maybe not. Quantum computing may be just the tool we need to design such a clean, safe and easy-to-deploy innovation.

In 1995 I first learned that quantum computing might bring about a revolution akin to the agricultural, industrial and digital ones we've already had. Back then it seemed far-fetched that quantum mechanics could be harnessed to such momentous effect; given recent events, it seems much, much more likely.

Much excitement followed Google's recent announcement of quantum supremacy: "[T]he point where quantum computers can do things that classical computers can't, regardless of whether those tasks are useful". The question now is whether we can develop the large-scale, error-corrected quantum computers that are required to realize profoundly useful applications. The good news is we already know, concretely, how to use such fully-fledged quantum computers for many important tasks across science and technology. One such task is the simulation of molecules to determine their properties, interactions, and reactions with other molecules – a.k.a. chemistry – the very essence of the material world we live in.

While simulating molecules may seem like an esoteric pastime for scientists, it does, in fact, underpin almost every aspect of the world and our activity in it. Understanding their properties unlocks powerful new pharmaceuticals, batteries, clean-energy devices and even innovations for carbon capture.

To date, we haven't found a way to simulate large complex molecules – and with conventional computers we never will, because the problem grows exponentially with the size or complexity of the molecules being simulated. Crudely speaking, if simulating a molecule with 10 atoms takes a minute, a molecule with 11 takes two minutes, one with 12 atoms takes four minutes and so on. This exponential scaling quickly renders a traditional computer useless: simulating a molecule with just 70 atoms would take longer than the lifetime of the universe (around 13 billion years).
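Before going further, it is worth running the numbers in that toy model (one minute for 10 atoms, doubling with every added atom). The short Python sketch below is an added back-of-the-envelope check, not part of the original article:

```python
# Toy model quoted above: a 10-atom molecule takes 1 minute to simulate
# classically, and each additional atom doubles the runtime.

MINUTES_PER_YEAR = 60 * 24 * 365
AGE_OF_UNIVERSE_YEARS = 13e9   # the article's round figure

def runtime_years(n_atoms: int) -> float:
    """Classical simulation time under the doubling-per-atom toy model."""
    return 2.0 ** (n_atoms - 10) / MINUTES_PER_YEAR

for n in (10, 20, 40, 70):
    print(f"{n:2d} atoms: {runtime_years(n):.3g} years"
          f"  ({runtime_years(n) / AGE_OF_UNIVERSE_YEARS:.2g} universe lifetimes)")
```

At 40 atoms the toy model already takes about two millennia; at 70 atoms it takes roughly 2 × 10¹² years, overshooting the age of the universe more than a hundredfold.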
This is infuriating, not just because we can't simulate existing important molecules that we find (and use) in nature – including within our own bodies – and thereby understand their behaviour, but also because there is an infinite number of new molecules that we could design for new applications.

That's where quantum computers could come to our rescue, thanks to the late, great physicist Richard Feynman. Back in 1981, he recognized that quantum computers could do what would be impossible for classical computers when it comes to simulating molecules. Thanks to recent work by Microsoft and others, we now have concrete recipes for performing these simulations.

A quantum catalyst to tackling climate change?

One area of urgent practical importance where quantum simulation could be hugely valuable is in meeting the SDGs – not only in health, energy, industry, innovation and infrastructure but also in climate action. Examples include room-temperature superconductors (which could reduce the 10% of energy production currently lost in transmission), more efficient processes to produce the nitrogen-based fertilizers that feed the world's population, and new, far more efficient batteries.

One very powerful application of molecular simulation is in the design of new catalysts that speed up chemical reactions. It is estimated that 90% of all commercially produced chemical products involve catalysts (in living systems, they're called enzymes). A catalyst for "scrubbing" carbon dioxide directly from the atmosphere could be a powerful tool in tackling climate change. Although CO2 is captured naturally, by oceans and trees, CO2 production has exceeded these natural capture rates for many decades. The best way to tackle CO2 is not to release more of it; the next best thing is to capture it. "While we can't literally turn back time, [it] is a bit like rewinding the emissions clock," according to Torben Daeneke at RMIT University.

There are known catalysts for carbon capture, but most contain expensive precious metals or are difficult or expensive to produce and/or deploy. "We currently don't know many cheap and readily available catalysts for CO2 reduction," says Ulf-Peter Apfel of Ruhr-University Bochum. Given the infinite number of candidate molecules available, we are right to be optimistic that there is a catalyst (or indeed many) to be found that will do the job cheaply and easily. Finding such a catalyst, however, is a daunting task without the ability to simulate the properties of candidate molecules – and that's where quantum computing could help. We might even find a cheap catalyst that enables efficient carbon dioxide recycling and produces useful by-products like hydrogen (a fuel) or carbon monoxide (a common source material in the chemical industry).

Quantum computing to the rescue – what will it take?

We can currently simulate small molecules on prototype quantum computers with up to a few dozen qubits (the quantum equivalent of classical computer bits). But scaling this to useful tasks, like discovering new CO2 catalysts, will require error correction and simulation on the order of 1 million qubits. It's a challenge I have long believed will only be met on any human timescale – certainly by the 2030 target for the SDGs – if we use the existing manufacturing capability of the silicon chip industry.

The path forward

At a meeting of the World Economic Forum's Global Future Councils last month, a team of experts from across industry, academia and beyond assembled to discuss how quantum computing can help address global challenges, as highlighted by the SDGs, and climate in particular. As co-chair of the Global Future Council on Quantum Computing, I was excited that we were unanimous in agreeing that the world should devote more resources, including in education, to developing the powerful quantum computing capability that could help tackle climate change, meet the SDGs more widely and much more. We enthusiastically called for more international cooperation to develop this important technology on the 2030 timescale so that it can have an impact on delivering the SDGs, climate in particular.

So the real question for me is: can we do it in time? Will we make sufficiently powerful quantum computers on that timeframe? I believe so. There are, of course, many other things we can and should do to tackle climate change, but developing large-scale, error-corrected quantum computers is a hedge we cannot afford to go without.
This article is republished from the World Economic Forum.
The drive to solve problems faster and more efficiently is never going to stop, and it has led to enhancements in existing technologies as well as the invention of several new ones. This hunger, combined with the competitive spirit of scientific research, has led humankind to a new era: the era of quantum computing. Quantum computers and quantum computing are as technical and complicated as they sound. Quantum computers have been in development for a long time but have yet to find practical use. Scientists believe these technological marvels will be significantly faster than your conventional desktop or even today's supercomputers. But how do quantum computers work? Read ahead to learn all about quantum computing.

How do Quantum Computers work?

Quantum computers work by performing calculations on the probability of an object's state before it is measured. Classical computers perform operations on 0s and 1s, the binary states, which are definite positions of physical states; quantum computers, by working with the probability of an object's state, can process much more data. Just as modern computing requires bits to process data, quantum computing requires qubits to process and analyse data. A qubit is the quantum state of an object: a property that remains undefined until it is detected, such as the spin of an electron, the face of a tossed coin, or the polarisation of a photon. These quantum states can look random but are inter-related, or entangled. The superpositions are mathematically relatable to the result, and by feeding these quantum states into the right algorithms, we can make advances in fields never touched before. Quantum computers can help solve complex mathematical equations, improve machine-learning techniques, produce better security codes and tackle even more complex scenarios.
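To see what "the probability of an object's state" means in practice, here is a small Python/NumPy sketch, added as an illustration (it is not from the article): it prepares a single simulated qubit in an equal superposition of 0 and 1 and then samples many measurements from it.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

# A qubit is described by a 2-component complex vector of amplitudes for |0> and |1>.
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                 # amplitudes (1/sqrt(2), 1/sqrt(2))
probs = np.abs(state) ** 2       # Born rule: measurement probabilities
print("P(0), P(1) =", probs)     # [0.5 0.5]

# Measurement collapses the superposition to a definite 0 or 1; the
# superposition is visible only in the statistics of repeated runs.
outcomes = rng.choice([0, 1], size=10_000, p=probs)
print("fraction of 1s over 10,000 shots:", outcomes.mean())   # ~0.5
```

Each individual readout is a definite 0 or 1; the quantum advantage comes from steering the amplitudes behind those probabilities with algorithms before measuring, as described above.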
Because of this potential to process data at very high speed and to solve complex equations, tech giants such as D-Wave Systems, IBM and Google claim to be very close to achieving quantum supremacy: demonstrating that a programmable quantum device can solve a problem that classical computers practically cannot solve within a viable time.

The Race for Quantum Computing

D-Wave Systems is one of the leading quantum computer manufacturers and has been producing, selling and installing quantum computers at organisations worldwide, such as the University of Southern California, Google, NASA and Los Alamos National Lab. D-Wave has already produced a 2048-qubit quantum computer and has announced a much bigger one: its fifth-generation, 5000-qubit machine, named Advantage, slated for release in mid-2020. Advantage uses the company's latest Pegasus topology, which provides higher connectivity and so helps in solving more complex problems than before.

On October 23, 2019, Google announced that it had achieved quantum supremacy, saying it had solved a problem that would take a considerable amount of time even on the most powerful supercomputer available today. Using a quantum computer named Sycamore, researchers at Google performed random circuit sampling, a sequence of random operations applied to qubits. After running the operations many times, they measured the values of the qubits. The distribution of numbers they obtained was close to random yet still correlated because of quantum effects. According to the team, performing these operations and calculations on the most powerful classical computing platform available would take around 10,000 years, while Sycamore completed them in 200 seconds. "With the first quantum computation that cannot reasonably be emulated on a classical computer, we have opened up a new realm of computing to be explored", wrote Google researchers John Martinis and Sergio Boixo in a Google AI blog.

But it doesn't stop there. Even before Google's announcement, IBM published a report on October 21, 2019, claiming that the calculations of the 53- and 54-qubit Sycamore circuits could be done with classical algorithms within a couple of days. IBM has also been working on its own quantum computer, which it has made available on the cloud. The company has named it IBM Q System One, and organisations can pay to reserve time on the machine. Major businesses such as Goldman Sachs, Samsung and JPMorgan Chase & Co., among other big names, are investing time and money in System One to see how quantum computing can be used in real-life scenarios. IBM has been developing IBM Q and increasing its qubit count since the system first launched in May 2016.

There has been a lot of development in this field, but we still haven't reached the stage where the technology is ready for daily use; in many areas your laptop is far more powerful and efficient than a quantum computer. Even with continuous development and advancement, practical quantum computers remain a thing of the future, and it will take at least a decade, if not more, for them to replace the computers we use today. Fitting in enough qubits to solve any problem thrown at the machine will take years of development. But if a practical quantum computer is built, it could track down any information available, break the security measures of any platform, mine cryptocurrency effortlessly, and find a piece of information across a million databases within seconds. The possibilities are endless, and perhaps beyond our imagination, but the technology needs to evolve, and only time will tell what it has to offer.
Nanoscale discovery could help cool overheating in electronics

A team of physicists at CU Boulder has solved the mystery behind a puzzling phenomenon in the nano realm: why some ultra-small heat sources cool faster if you move them close together. The results, published today in the journal Proceedings of the National Academy of Sciences (PNAS), could one day help the tech industry design faster electronic devices that overheat less.

"Often times heat is a difficult consideration in electronics design. You build a device and then find it heats up faster than you want," said study co-author Joshua Knobloch, a postdoctoral research associate at JILA, a joint research institute between CU Boulder and the National Institute of Standards and Technology (NIST). "Our goal is to understand the fundamental physics involved so that we can design future devices to effectively manage heat flow."

The research began with an unexplained observation: in 2015, researchers led by physicists Margaret Murnane and Henry Kapteyn at JILA were experimenting with metal bars several times thinner than the width of a human hair on a silicon base. When they heated these bars with a laser, something strange happened. "They behaved in a very counterintuitive manner," Knobloch said. "These nanoscale heat sources don't usually dissipate heat efficiently. But if you pack them together, they cool much faster."

Now the researchers know why. In the new study, they used computer simulations to track the passage of heat from their nanoscale bars. They found that when the heat sources were brought closer together, the energy vibrations they produced began to bounce off each other, dispersing the heat and cooling the bars. The group's findings highlight a major challenge in designing the next generation of tiny devices, such as microprocessors or quantum computing chips: at very small scales, heat doesn't always behave the way you expect it to.

Atom by atom

Heat transmission in devices matters, the researchers added. Even tiny flaws in the design of electronics like computer chips can allow temperature to build up, increasing wear and tear on a device. As tech companies strive to produce ever smaller electronic devices, they will need to pay more attention than ever to phonons, the vibrations of atoms that carry heat in solids. "Heat flow involves very complex processes, which makes it difficult to control," Knobloch said. "But if we can understand how phonons behave on a small scale, then we can tailor their transport, which allows us to build more efficient devices."

To do this, Murnane, Kapteyn and their team of experimental physicists joined forces with a group of theorists led by Mahmoud Hussein, a professor in the Ann and H.J. Smead Department of Aerospace Engineering Sciences whose group specializes in simulating and modeling the motion of phonons. "At the atomic scale, the very nature of heat transfer emerges in a new light," said Hussein, who also holds a courtesy appointment in the Department of Physics.

The researchers essentially recreated their experiment from several years earlier, but this time entirely on a computer. They modeled a series of silicon bars, laid side by side like the slats of a railroad track, and heated them. The simulations were so detailed, Knobloch said, that the team could track the behavior of every atom in the model – millions in all – from start to finish.
"We were really pushing the memory limits of the Summit supercomputer at CU Boulder," he said.

Direct the heat

The technique paid off. The researchers found, for example, that when they spread their silicon bars far enough apart, heat tended to escape from the materials in a predictable way: energy leaked out of the bars and into the material below, dissipating in all directions. When the bars moved closer together, however, something else happened. As the heat from the sources dispersed, it effectively forced that energy to flow more intensely away from the sources, like a crowd of people in a stadium jostling against each other and pushing toward the exit. The team called this phenomenon "directional heat channeling". "This phenomenon increases heat transport down into the substrate and away from the heat sources," Knobloch said.

The researchers suspect that engineers may one day exploit this unusual behavior to better control how heat flows through small electronic devices, directing that energy along a desired path instead of letting it flow unchecked. For now, the researchers see the study as an example of what scientists from different disciplines can do when they work together. "This project was an exciting collaboration between science and engineering, where the advanced methods of computational analysis developed by Mahmoud's group were essential to understanding the behavior of new materials discovered earlier by our group using new extreme-ultraviolet quantum light sources," said Murnane, also a professor of physics.

CU Boulder's other co-authors on the research include Hossein Honarvar, a postdoctoral researcher in aerospace engineering sciences and at JILA, and Brendan McBennett, a graduate student at JILA. Former JILA researchers Travis Frazer, Begoña Abad and Jorge Hernandez-Charpak also contributed to the study.

Reference: Directional thermal channeling: Phenomenon triggered by tight compression of heat sources, Proceedings of the National Academy of Sciences (2021). DOI: 10.1073/pnas.2109056118
Sodium is a chemical element with symbol Na (from the Latin natrium) and atomic number 11. It is a soft, silver-white, highly reactive metal, the most common alkali metal, and the sixth most abundant element on Earth, making up about 2.8 percent of Earth's crust. At standard temperature and pressure, sodium combines with the oxygen in air to form grayish-white sodium oxide unless it is immersed in oil or an inert gas, which is how it is usually stored. Sodium can be cut easily with a knife and is a good conductor of electricity and heat because it has only one electron in its valence shell, resulting in weak metallic bonding. Like other light elements such as carbon, sodium forms inside stars that are beginning to run out of fuel, and it is scattered through space when such a star explodes in a supernova. In Earth's upper atmosphere, sodium atoms are ionized mostly by charge transfer with the ambient NO⁺ and O₂⁺ ions, with a small contribution from solar photoionization.

Key data for the element:

Name: Sodium
Symbol: Na
Atomic number: 11
Atomic mass: 22.98977 u
Melting point: 97.72 °C (370.87 K)
Boiling point: 883 °C (1156 K)
Protons/electrons: 11
Neutrons (most common isotope): 12
Classification: alkali metal
Crystal structure: body-centered cubic
Density at 293 K: 0.971 g/cm³
Color: silvery

A neutral sodium atom has 11 electrons arranged in the configuration 1s² 2s² 2p⁶ 3s¹: the first two electrons fill the 1s orbital, the next two the 2s orbital, the next six the 2p orbital, and the final electron sits alone in the 3s orbital as the single valence electron. Most atoms do not have eight electrons in their valence shell; following the octet rule (a consequence of trends in orbital energies, and a useful guide to why atoms form the ions they do), a sodium atom readily gives up its lone 3s electron to form the sodium ion Na⁺, which has 11 protons but only 10 electrons and a full outer shell of eight. The same accounting answers a classic exercise: an atom of sodium-23 (Na-23) with a net charge of +1 has 11 protons (fixed by the atomic number), 12 neutrons (mass number 23 minus 11 protons), and 10 electrons (one fewer than the number of protons, giving the +1 charge).

Sodium chloride (NaCl), ordinary table salt, consists of Na⁺ and Cl⁻ ions; it occurs abundantly in nature and is recovered from salt mines or by evaporating seawater. In the laboratory, a roughly 0.1 mol/L sodium hydroxide (NaOH) titrant is commonly standardized against potassium hydrogen phthalate (KHP), which has one acidic hydrogen atom and therefore reacts with NaOH on a 1:1 stoichiometric basis. The sodium emission spectrum is dominated by the bright doublet known as the sodium D-lines, at 588.9950 and 589.5924 nanometers, emitted in transitions from the 3p to the 3s level; the splitting of these lines in a magnetic field is the sodium Zeeman effect.

Two standard calculations feature sodium. The first is mole arithmetic: one mole of sodium, 6.022 × 10²³ atoms (Avogadro's number), weighs 23 g, so a single atom weighs 23/(6.022 × 10²³) g, and a sample of 9.76 × 10¹² atoms is 9.76 × 10¹² ÷ 6.022 × 10²³ = 1.621 × 10⁻¹¹ mol. (To get a sense of how large Avogadro's number is: one mole of pennies would be enough money to pay all the expenses of every country on Earth for about the next billion years.) The second is finding the atomic radius from crystal data. Given the density ρ = 0.97 g/cm³ and molar mass M = 23 g/mol, the density of a cubic unit cell of edge a containing n atoms is ρ = nM/(a³N_A); for a body-centered cubic (bcc) lattice there are n = 2 atoms per unit cell, and the atomic radius is r = (√3 a)/4. Solving the density relation for a and substituting gives the radius of a sodium atom.
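Here is that second calculation as a short Python sketch, added for illustration and using exactly the numbers given above:

```python
# Atomic radius of sodium from its bcc crystal structure.
# Density relation: rho = n*M / (a^3 * N_A)  =>  a = (n*M / (rho * N_A))**(1/3)
# bcc geometry (atoms touch along the body diagonal): r = sqrt(3) * a / 4

import math

rho = 0.97       # density, g/cm^3
M = 23.0         # molar mass, g/mol
N_A = 6.022e23   # Avogadro's number, atoms/mol
n = 2            # atoms per bcc unit cell

a = (n * M / (rho * N_A)) ** (1 / 3)   # unit-cell edge, cm
r = math.sqrt(3) * a / 4               # atomic radius, cm

print(f"cell edge a = {a:.3e} cm")     # ~4.29e-08 cm
print(f"radius    r = {r:.3e} cm")     # ~1.86e-08 cm, i.e. about 1.86 angstroms
```

The result, about 1.86 Å, agrees with the accepted metallic radius of sodium.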
When we look back into the deep past of the Universe, which means looking out over vast cosmological distances of space, we observe a peculiar set of galaxies emitting tremendous amounts of energy. These early galaxies, known variously as quasars, blazars, radio galaxies and radio-loud quasars, are all bodies classified as active galactic nuclei, and they are some of the most energetic phenomena in the universe, as the name blazar already suggests.

Active galactic nuclei represent a confirmation of physicist Nassim Haramein's prediction that black holes are the spacetime structure that forms the seed around which galaxies and stars form. Indeed, it is now widely understood that the early formation of galaxies, producing active galactic nuclei, is driven by supermassive black holes – black holes of upwards of a million to a billion solar masses. The super-anatomy of these central galactic black holes is as intriguing as the enigmatic beacons they form in the deep field of space. Although all major galaxies probably have a supermassive black hole in their central region, as this is the structure that initiates galaxy formation in the first place, active galactic nuclei are thought to represent an early phase of this process, when the supermassive black holes were extremely active, emitting large amounts of energy (and probably matter as well) and forming the first galaxies. As a consequence of accreting the pre-galactic material, massive amounts of matter were both gravitated into the central black holes and emitted from their poles. The inflowing matter forms an ultra-hot accretion disc around the equatorial region of the black hole, while relativistic jets (charged particles, or electron-positron plasma, moving at relativistic speeds) stream along the axis of rotation and can extend up to hundreds of thousands of light years.

"The implied alignment of the spin axes of massive black holes that give rise [to] the radio jets suggest the presence of large-scale spatial coherence in angular momentum" – A. Taylor & P. Jagannathan

These extremely energetic and massive structures are readily identified in deep-space images collected in the radio band of the electromagnetic spectrum. The scale of observation is grand: gathering light from numerous galaxies across several million parsecs of space. Equally, the instrumentation used to gather light from such distant and vast sources is colossal – think of the Arecibo Radio Telescope, featured in such films as Contact, to get an idea of how massive these telescopes can be. One proposed instrument, the Square Kilometre Array, will be one of the largest scientific observational instruments ever constructed, with a total collecting area of roughly a square kilometer. When completed, it will help determine fundamental cosmological parameters and probe the earliest epochs of galaxy formation.

In a recent study using the Giant Metrewave Radio Telescope, South African astronomers made a remarkable discovery when analyzing the alignment of the spin axes of 64 galaxies. The orientation of the axis of rotation of an active galactic nucleus is directly observable because of the long plasma jets streaming from the poles of the central supermassive black hole, with strong electromagnetic emissions in the radio frequency range.
In a paper in the Monthly Notices of the Royal Astronomical Society, the team of astrophysicists analyzed the orientation of the radio-jet position angles and found that a surprisingly large number of the supermassive black holes had aligned spin axes. Statistical analysis revealed only a 0.1% probability of such an alignment occurring by chance – strongly indicating that some as yet unseen influence is producing coherence among cosmological-scale objects. Moreover, this may imply that conditions during the earliest epochs of galaxy formation deviated from complete isotropy – that is, from a perfectly uniform distribution of matter. It has long been presumed that the universe is homogeneous and isotropic (the same in all locations), with no identifiable axis or orientation; indeed, this is known as the cosmological principle.

Yet one of the 20th century's greatest minds, Kurt Gödel, provided an exact solution of the Einstein field equations that described a rotating universe. In commentary on Gödel's work, physicist Stephen Hawking said: "These models could well be a reasonable description of the universe that we observe, however observational data are compatible only with a very low rate of rotation. The quality of these observations improved continually up until Gödel's death, and he would always ask 'is the universe rotating yet?' and be told 'no, it isn't.'"

More recently, several findings have suggested that the universe is indeed not entirely homogeneous and isotropic. Examples include the so-called "axis of evil" identified in an analysis of the cosmic microwave background radiation; "dark flow"; Shamir's report on the Sloan Digital Sky Survey showing that left-twisting spiral galaxies are much more common than right-twisting ones; and structural mapping such as the BOSS Great Wall and Laniakea.

While the strong correlation of the spin alignment of multiple supermassive black holes across cosmological distances may seem puzzling – since under standard presumptions there should be little to no interaction between galactic nuclei across such vast distances – Haramein has long described dynamics and properties of spacetime that would naturally produce the kind of correlated orientation and entanglement of objects observed in this latest study. Haramein has explained the structural and geometric properties of space and matter from the smallest to the largest scale, and it is in considering the largest-scale structure, the universe itself, that we glean an understanding of how and why these vast arrays of galaxies are uniformly aligned in their axes of rotation. Namely, just as indicated by the "axis of evil", "dark flow", the great walls and great voids, the universe is not isotropic but instead has a definite orientation. Haramein has identified this large-scale structure as a double-toroidal, counter-rotating geometry. On this view, the reason for the alignment is the uniform spin of the universe, which has a strong correlating (entangling) effect on objects that are uniformly affected by the Coriolis forces of the spinning structure: spin dynamics naturally produce strong coherence.
From this theory, we see that spin is not the result of matter accretion in the early universe; instead, it is the intrinsic spin and high curvature of spacetime that engender the gravitational accretion of matter into the structures that are observed. Since spin "came first", we would expect a remarkably high degree of correlation among the spin axes of primordial active galactic nuclei. In the paper The Origin of Spin: A Consideration of Torque and Coriolis Forces in Einstein's Field Equations and Grand Unification Theory, Haramein and Elizabeth Rauscher evaluate the inclusion of torque and Coriolis effects in Einstein's field equations of spacetime geometry (gravity). The main result of such a consideration is that spin is an intrinsic characteristic of spacetime itself, explaining galactic formations, polar jets, accretion discs, spiral arms and galactic halos without the need for exotic dark-matter constructs. Remarkably, this is an instrumental facet of a grand unification theory, as the torque and Coriolis effects of spacetime would also produce the bodies and particle interactions observed at the atomic and hadron scale.

With further consideration, could there be additional forces that allow such strong alignment to be preserved over time? For instance, galactic magnetic field interactions, which have been observed at cosmological scales, could play a role in stabilizing the strong alignment of the polar radio jets of the supermassive black holes and in maintaining the anisotropy over long periods of time. Indeed, instruments such as the Square Kilometre Array will allow the study and analysis of galactic magnetic field interactions to see to what degree they are involved in large-scale galactic interactions.

There is another important interaction, however, that may be involved in the strong correlation of the observed spin axes, and like the intrinsic spin of spacetime described by Haramein, it is an intriguing spacetime geometrical object. Known technically as Einstein-Rosen bridges (ER bridges), after the two physicists who first described their properties through maximally extended Schwarzschild solutions of Einstein's field equations, we know them more colloquially as wormholes. Haramein has long described how the black holes that form the hearts of stellar and galactic objects are connected in a vast spacetime wormhole network, meaning that black holes would be entangled across vast spatial and temporal distances – much like what has been observed in the radio-jet spin alignment. Interestingly, more recent advances in unified physics have equated spacetime wormholes with the phenomenon of quantum entanglement: Einstein-Rosen bridges correspond to Einstein-Podolsky-Rosen correlations, expressed concisely as ER = EPR. This means that not only would spacetime geometry entangle astronomical-scale black holes, but miniature ones as well (what are referred to as fundamental particles). What we are observing in this latest study may very well be quantum entanglement at a cosmological scale, as a result of the fluid dynamics of spacetime, linking together the connected universe.
How do you stop light in midflight and hold on to it – even for a fraction of a second? This ability could be crucial to such future quantum optical systems as secure communications or new kinds of information technologies. A group led by Dr. Ofer Firstenberg at the Weizmann Institute of Science recently demonstrated a method in which individual particles of light – photons – are trapped and released on demand in a way that might, in the future, be used as memory for quantum information. A description of their quantum optical memory was recently published in Science Advances.

Photons can carry information in the same way that electrons do, explains Firstenberg, who is in the Institute's Physics of Complex Systems Department. In addition, they can travel long distances, for example in optical fibers, without losing that information, so in future quantum memory and information technologies, photon-based systems may be better than electronic ones for certain kinds of communication and remote sensing. Like electronic systems, photon-based systems need to package and synchronize multiple bits of information, and to create such "photon packages" the timing of the photons must be controlled. Existing devices – photon sources – are able to shoot single photons, but they do so randomly: there is no way to predict exactly when a photon will escape the source or how much time will elapse until the next one is freed. One way to deal with this lack of control is to find a way of capturing the photons, holding them in one place and releasing them on demand – that is, temporarily storing particles of light.

Although Firstenberg and his group are not the first to store photons, they are the first to do so in a way that works at room temperature and is relatively fast, very efficient and noiseless (with no distortion of the information). They call their system FLAME, for Fast Ladder Memory. It consists of laser sources and a small amount of pure atomic gas – in this case, of the element rubidium. The electrons of the rubidium atoms act as the "photon memory," and strong laser pulses are used for the writing and reading processes. The flying photons are first stored in electrons that have been excited – that is, the electrons' orbits around the nuclei move out a notch. Then, some tens of nanoseconds later – long enough to synchronize the output from many fast photon sources – the memory is read, returning the electrons to their normal ground state and the photons to their flight.

FLAME, the scientists explain, is considered almost completely free of noise – the unwanted disturbances that often plague such systems – because what goes in is what comes out. "The photons that are released from the electrons are identical to those we put in – with the exact same properties and propagation direction. So something like one in 10,000 might be a photon we did not put there. As a quantum memory, the system is fantastic," says PhD student Ran Finkelstein, who led this study together with Dr. Eilon Poem in Firstenberg's lab. These findings were published in Science Advances, together with the results of similar experiments conducted at Oxford University, UK.

Today, the experimental setup takes up a large table – mostly covered in lasers, mirrors and lenses – but the actual trapping takes place in a container the size of a thumb.
Eventually, the scientists hope to miniaturize the process: an atomic gas containing billions of atoms can be contained in a sealed space of one cubic millimeter, and since the atoms return to their original state, it can be reused almost indefinitely. "We need only three elements – a photon source, a contained gas cloud and a strong laser," says Finkelstein. "This is not a delicate system that works only in ultrahigh vacuum or at very low temperatures. Eventually we'll be able to insert a system like this in something the size of a cell phone."

Farther in the future, the idea of using photons to convey information in such processes as quantum computing, communications or sensing could involve one of the stranger aspects of quantum physics – a phenomenon known as entanglement. Famously called "spooky action at a distance," entanglement means that a change to one of two entangled particles results in an instantaneous change in the other – that is, information is somehow shared non-locally (there is no way the information could have been passed from one to the other by standard means). "If the trapped photons were first entangled with other photons some distance away, this would be quantum communication in the true sense of the word – really based on principles of quantum mechanics that we can't observe in the everyday world," says Poem. Quantum communication, if it could be developed, would be almost impossible to tamper with, and thus researchers believe it could be especially useful for new kinds of encryption.

Firstenberg and his group plan to test entangled photons in the FLAME setup, and they have other ideas as well for new experiments with their quantum optical system. For example, they intend to create more complex components, such as logic gates for the information carried by the stored photons. "While we still don't know which future quantum information systems will prevail," says Firstenberg, "there are some things for which we know photons are best. For example, the recent discovery of gravitational waves in a distant galaxy relied on powerful optical sensors. Our communications are already sent by light waves through thin optic fibers; photon 'quantum bits' can travel in similar fibers. So quantum memory systems based on single photons may have applications in the not-too-distant future."

True quantum information processing with photons may be in the distant future, but the current research in Firstenberg's lab on developing efficient and noiseless optical quantum memory is bringing that future closer. Photons move at, well, the speed of light, and they are easily destroyed – each and every one of us constantly destroys photons as our eyes take in and absorb light. And they are extremely faint, so it takes a very fine "net" to catch them, even for a fraction of a second. So why do scientists search for ways of trapping and using single photons? Photons can be varied and manipulated in ways that electrons cannot, and if they are left undisturbed they can travel through transparent materials or vacuum practically forever without losing their strength. Since single photons obey the laws of quantum mechanics, researchers hope to find ways of applying some of that "quantum weirdness" to create new types of computation, memory and especially communications. Several ideas for using photons to secure communications have been suggested.
If a single photon were used as a "key," for example, anyone trying to intercept the transmission would destroy that key. Similarly, if photons at either end were entangled, a change in the photon at the receiving end would alert the recipient that tampering had occurred. Dr. Ofer Firstenberg's research is supported by the Sir Charles Clore Research Prize; the Laboratory in Memory of Leon and Blacky Broder, Switzerland; and the European Research Council.
Quantum superposition has been used to compare data from two different sources more efficiently than is possible, even in principle, on a conventional computer. The scheme is called "quantum fingerprinting" and has been demonstrated by physicists in China. It could ultimately lead to better large-scale integrated circuits and more energy-efficient communication.

Quantum fingerprinting offers a way of minimizing the amount of information that is transferred between physically separated computers that are working together to solve a problem. It involves two people – Alice and Bob – each sending a file containing n bits of data to a third-party referee, whose job is to judge whether or not the two files are identical. A practical example could be a security system that compares a person's fingerprint to a digital image.

Proposed theoretically in 2001, quantum fingerprinting can make a comparison exponentially more efficiently than is possible using conventional computers. While the only way to ensure a complete comparison is to send the two files in their entirety, it turns out that a reasonably accurate comparison can be achieved classically by sending just the square root of the number of bits. Quantum mechanics allows comparisons with even less data because a quantum bit (qubit) of information can exist not just as a zero or a one but, in principle at least, in an infinite number of intermediate states. The vast increase in the number of possible combinations of states for a given number of qubits means that the number of physical bits that need to be transmitted scales logarithmically with the number of bits in the two files. As such, quantum fingerprinting permits an exponential reduction in the amount of transmitted data compared with classical algorithms.

The original proposal for quantum fingerprinting involved using log n highly entangled qubits, which Norbert Lütkenhaus of the University of Waterloo in Canada says is still many more qubits than can be implemented using today's technology. In 2014 he and Juan Miguel Arrazola, now at the National University of Singapore, unveiled a more practical scheme. It involves Alice and Bob encoding their n bits in the optical phases of a series of laser pulses and then sending those pulses to a beam splitter (the referee). The pairs of pulses arrive at the beam splitter one at a time: if the two pulses have the same phase they exit from one port, whereas opposite phases cause them to leave from a second port. In this way, the two files are judged to be identical if there is no signal at the second port.

The ramp-up in efficiency comes from the fact that each pulse can be made from a tiny fraction of a single photon. On average the pulses contain less than one photon, which is achieved by attenuating the laser light, and so n pulses can be encoded using just log n photons. As Lütkenhaus points out, the number of photons cannot be made arbitrarily small, because for the referee to obtain the right answer – that the files are or are not identical – there needs to be a reasonable chance that a photon is detected when the phases are different. "The scheme gives us an asymptotically accurate result," he says. "The more photons I put in, the closer I get to the black and white probability."

Last year, Lütkenhaus and Arrazola, working with Hoi-Kwong Lo, Feihu Xu and other physicists at the University of Toronto, put the scheme into practice by modifying a quantum-key-distribution system sold commercially by the firm ID Quantique in Geneva.
They showed that they could match files as large as 100 megabits using less information than is possible with the best-known classical protocol. They did admit, however, that their scheme, while more energy efficient, took more time to carry out. Now, a group led by Jian-Wei Pan and Qiang Zhang of the University of Science and Technology of China in Hefei has beaten not only the best existing classical protocol but the theoretical classical limit (which is some two orders of magnitude lower). The researchers did so by using more tailor-made equipment – in particular, they employed superconducting rather than standard avalanche photon detectors, which reduced the number of false-positive signals from the beam splitter and so improved the accuracy of the yes/no outputs, and designed a novel kind of interferometer. Pan and colleagues successfully compared two roughly two-gigabit video files by transmitting just 1300 photons along 20 km of spooled fibre-optic cable, which is about half of what would be needed classically. Next, they plan to test their system by placing Alice, Bob and the referee at different points in a city such as Shanghai. Despite Pan’s demonstration, Lütkenhaus thinks that quantum fingerprinting probably won’t be commercialized because its superiority over classical systems depends on fairly artificial conditions, such as the referee being unable to talk back to Alice and Bob. However, he says that the research “opens the door” to other, potentially more useful, applications. One example is database searching when the searcher doesn’t have access to the whole database, while the owner of the database can’t see the search terms. “For this, we have made a protocol but not the technology,” he says. The work is reported on the arXiv preprint server.
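To put rough numbers on that advantage, the following Python sketch, added as an illustration, compares the three transmission costs for file sizes like those in the experiments, ignoring the constant factors in front of each scaling law:

```python
import math

def transmission_cost(n_bits: int) -> dict:
    """Order-of-magnitude cost of comparing two n-bit files, constants ignored."""
    return {
        "send everything (exact) ~n      ": n_bits,
        "classical fingerprint   ~sqrt(n)": math.isqrt(n_bits),
        "quantum fingerprint     ~log2(n)": math.ceil(math.log2(n_bits)),
    }

for n in (100 * 10**6, 2 * 10**9):    # ~100-megabit and ~2-gigabit files
    print(f"n = {n:.0e} bits")
    for scheme, cost in transmission_cost(n).items():
        print(f"  {scheme}: ~{cost:,}")
```

Constant factors matter in practice – the Hefei experiment needed about 1300 photons for its two-gigabit comparison rather than the bare log₂ value of 31 – but the gap between the square-root and logarithmic columns is the exponential advantage being exploited.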
While the word "quantum" has only started trending in the technology space during the last decade, many past technologies already relied on our understanding of the quantum world, from lasers to MRI imaging, electronic transistors and nuclear power. The reason quantum has become so popular lately is that researchers have become increasingly better at manipulating individual quantum particles (light photons, electrons, atoms) in ways that weren't possible before. These advances allow us to harness more explicitly the unique and weird properties of the quantum world, and they could launch yet another quantum technology revolution in areas like sensing, computation and communication.

What's a Quantum Computer?

The power of quantum computers comes chiefly from the superposition principle. A classical bit can only be in a 0 or 1 state, while a quantum bit (qubit) can exist in several combinations of 0 and 1 states. When one measures and observes the qubit, it collapses into just one of these combinations, each combination having a specific probability of occurring. While two classical bits can only exist in one out of four combinations at a time, two qubits can exist in all these combinations simultaneously before being observed. Qubits can therefore hold more information than classical bits, and the amount of information they can hold grows exponentially with each additional qubit: twenty qubits can already hold a million values simultaneously (2²⁰), and 300 qubits can represent more values than there are particles in the universe (2³⁰⁰).

However, to harness this potential processing power, we must understand that probabilities in quantum mechanics do not work like conventional probabilities. The probability we learned about in school allows only numbers between 0 and 1. Probabilities in quantum mechanics, on the other hand, behave as waves with amplitudes that can be positive or negative, and just like waves, quantum probabilities can interfere, reinforcing each other or cancelling each other out. Quantum computers solve computational problems by harnessing such interference: the quantum algorithm choreographs a pattern of interference in which the combinations leading to a wrong answer cancel each other out, while the combinations leading to the correct answer reinforce each other. This process gives the computer a massive speed boost. We only know how to create such interference patterns for particular computational problems, so for most problems a quantum computer will only be as fast as a conventional computer. One problem where quantum computers are much faster than classical ones, however, is finding the prime factors of very large numbers.

How Quantum Computers Threaten Conventional Cryptography

Today's digital society depends heavily on securely transmitting and storing data. One of the oldest and most widely used methods of encrypting data is called RSA (Rivest-Shamir-Adleman, after the surnames of the algorithm's designers). RSA protocols encrypt messages with a key that results from the multiplication of two very large numbers, and only someone who knows the values of these two numbers can decode the message. RSA security relies on a mathematical principle: multiplying two large numbers is computationally easy, but the opposite process – figuring out which large numbers were multiplied – is extremely hard, if not practically impossible, for a conventional computer.
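As an added toy illustration of that asymmetry (with comically small primes – real RSA moduli are thousands of bits long):

```python
# Multiplying two primes is trivial; undoing it by trial division takes about
# as many steps as the smaller prime factor, which explodes with key size.

def factor_by_trial_division(n: int) -> tuple[int, int]:
    """Return (p, q) with p * q == n by brute-force search (tiny n only)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    raise ValueError("n is prime")

p, q = 1_000_000_007, 1_000_000_009    # two small (30-bit) primes
N = p * q                              # the easy direction: one multiplication
print(N)                               # 1000000016000000063

print(factor_by_trial_division(N))     # the hard direction: ~1e9 loop steps

# The brute-force search above already takes minutes in pure Python. A real
# RSA modulus has ~2048 bits, i.e. prime factors of ~308 digits; trial division
# would then need more steps than there are atoms in the observable universe.
```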
However, in 1994 the mathematician Peter Shor proved that an ideal quantum computer could find the prime factors of large numbers exponentially more quickly than a conventional computer and thus break RSA encryption within hours or days. While practical quantum computers are likely decades away from implementing Shor's algorithm with enough performance and scale to break RSA or similar encryption methods, the potential implications are terrifying for our digital society and our data safety. In combination with private-key systems like AES, RSA encrypts most of the traffic on the Internet, so breaking RSA would leave emails, online purchases, medical records, company data and military information, among much else, more susceptible to attacks from malicious third parties. Quantum computers could also crack the digital signatures that ensure the integrity of updates to apps, browsers, operating systems and other software, opening a path for malware.

This security threat has led to heavy investment in new quantum-resistant encryption. Besides, some existing private-key systems used in the enterprise telecom sector, such as AES-256, are already considered quantum resistant. However, even if these methods are secure now, there is no guarantee that they will remain secure in the future: someone might discover a way to crack them, just as happened with RSA.

Quantum Key Distribution and its Impact on the Telecom World

Given these risks, arguably the most secure way to protect data and communications is to fight quantum with quantum: protect your data from quantum-computer hacking by using security protocols that harness the laws of quantum physics. That's what quantum key distribution (QKD) does. QKD uses qubits to generate a secret cryptographic key protected by the phenomenon of quantum state collapse: if an attacker tries to eavesdrop and learn information about the key, they will distort the qubits irreversibly, and the sender and receiver will see this distortion as errors in their qubit measurements and know that their key has been compromised.

Quantum-safe encryption will enter people's day-to-day lives through upgrades to laptops, phones, browsers and other consumer products. However, most of the burden of quantum-safe communication will be carried by the businesses, governments and cloud service providers that must design and install these systems. It is a hugely complex change, on a par with upgrading internet communications from IPv4 to IPv6. Even though practical quantum computers are not yet available, it is essential to begin investing in these changes now, as explained by Toshiba Chief Digital Officer Taro Shimada: "Sectors such as finance, health and government are now realizing the need to invest in technology that will prepare and protect them for the quantum economy of the future. Our business plan goes far deeper and wider than selling quantum cryptographic hardware. We are developing a quantum platform and services that will not only deliver quantum keys and a quantum network but ultimately enable the birth of a quantum internet." Toshiba expects the QKD market to grow to approximately $20 billion worldwide in FY 2035.

How Photonics Impacts QKD

Qubits can be photons, electrons, atoms, or any other system that can exist in a quantum state, but photons as qubits are likely to dominate the quantum communications and QKD application space. We have decades of experience manipulating the properties of photons, such as polarization and phase, to encode qubits.
Quantum-safe encryption will enter people's day-to-day lives through upgrades to laptops, phones, browsers, and other consumer products. However, most of the burden for quantum-safe communication will be handled by businesses, governments, and cloud service providers that must design and install these systems. It's a hugely complex change, on par with upgrading internet communications from IPv4 to IPv6. Even if practical quantum computers are not yet available, it's essential to begin investing in these changes, as explained by Toshiba Chief Digital Officer Taro Shimada: "Sectors such as finance, health and government are now realizing the need to invest in technology that will prepare and protect them for the quantum economy of the future. Our business plan goes far deeper and wider than selling quantum cryptographic hardware. We are developing a quantum platform and services that will not only deliver quantum keys and a quantum network but ultimately enable the birth of a quantum internet." Toshiba expects the QKD market to grow to approximately $20 billion worldwide in FY 2035.

How Photonics Impacts QKD

Qubits can be photons, electrons, atoms, or any other system that can exist in a quantum state. However, using photons as qubits will likely dominate the quantum communications and QKD application space. We have decades of experience manipulating the properties of photons, such as polarization and phase, to encode qubits. Thanks to optical fiber, we also know how to send photons over long distances with relatively little loss. Moreover, optical fiber is already a fundamental component of modern telecommunication networks, so future quantum networks can run on that existing fiber infrastructure. All these signs point towards a new era of quantum photonics.

Photonic QKD devices have been, in some shape or form, commercially available for over 15 years. Still, factors such as high cost, large size, and the inability to operate over longer distances have slowed their widespread adoption. Many R&D efforts in quantum photonics aim to address these size, weight, and power (SWaP) limitations. One way to overcome the limitations and reduce the cost per device would be to integrate every QKD function—generating, manipulating, and detecting photonic qubits—into a single chip. The further development of the integrated quantum photonics (IQP) chip is considered by many to be a critical step in building the platform that will unlock quantum applications, in much the same way as integrated circuits transformed microelectronics. In the coming articles, we will discuss in more detail how to combine photonic integration with quantum technologies to address the challenges in quantum communications.
From designing new polymers and pharmaceuticals to modeling climate change and cracking encryption, quantum computing's potential applications have sparked a global quantum arms race.

What is Quantum Computing?

Since the birth of the single-chip microprocessor 50 years ago, computers have performed calculations by manipulating bits of information – ones and zeros – using tiny transistors baked into silicon chips. Modern processors cram tens of billions of transistors into a chip the size of a fingernail. Quantum computing does away with transistors. Instead, the ones and zeros – dubbed "qubits" – are recorded by changing the state of quantum objects, for example changing the magnetic orientation or "spin" of elementary particles like electrons.

Today's most powerful quantum computers can only string together a few dozen qubits, but they are already putting the most powerful traditional supercomputers to shame at some tasks. It's not simply a question of raw processing power. While the electrical charge of a single transistor can represent either a one or a zero, a single qubit can actually represent both one and zero simultaneously thanks to the quirks of quantum mechanics. This allows quantum computers to process multiple outcomes simultaneously and dramatically reduce the number of steps required to tackle complex problems – solving them in minutes rather than millennia.

Who Is Leading the Way?

Using the building blocks of the universe to power the next generation of supercomputers might seem like science fiction, but quantum computing is already a reality. The US and China are pouring billions of dollars into research and development, while Europe is also investing heavily, and breakthroughs are occurring around the globe. Along with universities, private-sector tech giants such as IBM, Microsoft, Google, Amazon, Alibaba and Baidu are also paving the way. At the same time, startups are working to solve some of the challenges which must be overcome for quantum computing to reach its full potential.

In October 2019, Google's Californian research lab became the first to achieve "quantum supremacy", performing a calculation that would be practically impossible for even the most powerful classical supercomputer. Google's 53-qubit Sycamore processor performed a calculation in 200 seconds which would have taken the world's most powerful supercomputer 10,000 years. The University of Science and Technology of China achieved quantum supremacy only 14 months later, claiming its Jiuzhang quantum computer to be 10 billion times faster than Google's.

What Challenges Lie Ahead?

While quantum supremacy is a major achievement, if quantum computing is a moonshot then quantum supremacy is only the equivalent of Yuri Gagarin's first space flight. Many challenges still lie ahead, and fully fledged, fault-tolerant quantum computers may still be more than a decade away. So far, quantum supremacy has only been achieved using computers and calculations especially designed to demonstrate quantum computing's strengths, not to solve real-world problems. A key milestone will be to achieve "practical" quantum supremacy when tackling real-world challenges, says Professor Andrea Morello. Winner of the American Physical Society's inaugural Rolf Landauer and Charles H. Bennett Award in Quantum Computing, Morello leads one of the University of New South Wales' quantum computing research teams in Sydney, Australia.
Practical quantum supremacy may still be a decade away, Morello says. It is difficult to predict which problem will be solved first, but one possibility is calculating a chemical reaction in order to synthesize a new pharmaceutical.

Achieving practical quantum supremacy will require error correction and fault tolerance, similar to traditional computers. Error correction proves challenging at the quantum level, where qubits are highly susceptible to interference and only remain stable for milliseconds, Morello says: "Google's quantum supremacy was achieved using 'uncorrected' qubit gates and, while this is impressive, error correction becomes important when you're aiming for practical quantum supremacy so you can trust the outcome enough to apply it to the real world. Quantum error correction has been demonstrated in the laboratory and right now a lot of resources are being invested into bringing it to fruition."
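A feel for why error correction pays off can be had from a purely classical toy model — a three-bit repetition code with majority voting. Real quantum error correction (surface codes, for instance) is far subtler, since qubits cannot simply be copied and phase errors must be handled too; the sketch below, ours rather than anything from the article, only shows how redundancy suppresses errors:

```python
import random

def noisy_copy(bit, p):
    # Flip the bit with probability p -- a crude stand-in for qubit noise.
    return bit ^ (random.random() < p)

def logical_error_rate(p, trials=100_000):
    errors = 0
    for _ in range(trials):
        # Encode one logical bit as three physical bits; decode by majority.
        votes = sum(noisy_copy(1, p) for _ in range(3))
        errors += votes < 2
    return errors / trials

p = 0.05
print(p)                       # physical error rate: 5%
print(logical_error_rate(p))   # logical rate: ~0.7% (3p^2 - 2p^3)
```

The decoded error rate falls from 5% to under 1% because two independent faults must now coincide — the same leverage, vastly generalized, that fault-tolerant quantum computing is after.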
How Are Quantum Computers Used Today?

While progress continues towards practical quantum supremacy, intermediate quantum computers still offer an advantage over classical computers in certain optimized applications, says GlobalData graduate analyst Sam Holt. "Fully-fledged, universal and fault-tolerant quantum computers may be more than a decade away, but a flurry of recent partnerships have explored use cases on intermediate devices. In January 2021, for example, Roche announced a collaboration with Cambridge Quantum Computing to develop quantum simulations for new drug discovery for Alzheimer's disease." Roche employs noisy intermediate-scale quantum (NISQ) algorithms, which lack error correction but are still useful for some tasks.

Another intermediate approach to quantum computing proposes installing low-qubit processors alongside traditional processors to act as "quantum accelerators". This allows certain aspects of processing to benefit from the quantum advantage, similar to the way a CPU can hand off specific tasks to a dedicated graphics card.

Even once practical quantum supremacy is achieved, Holt says it is likely that businesses in a wide range of industries will choose to rent time on cloud-based quantum computers rather than invest in their own hardware. "Quantum cloud offerings from companies such as IBM are enabling widespread quantum computing. Quantum computing's primary applications are in simulation, optimization, linear algebra and factorisation. These capabilities are increasingly becoming key requirements across a wide array of industries. Companies in these fields that are not at least investigating how quantum may transform their business risk getting left behind."

What Are the Applications for Quantum Computing?

Even when error correction and practical quantum supremacy are achievable, traditional computers will still be considerably smaller, cheaper and more practical for most calculations, Morello says: "Using a quantum computer to solve most problems is like using a 747 to go to the supermarket. Just like a jumbo jet, quantum computing proves its worth when you need to do the heavy lifting."

Chemistry is shaping up as quantum computing's first killer application, potentially helping humanity address some of its greatest challenges. Today the production of ammonia, the main ingredient of fertilizer, requires high-temperature furnaces which consume 2% of the world's energy and produce 1% of its CO2 output. Bacteria can produce ammonia at room temperature, and quantum computing may be the key to understanding and replicating this process.

In manufacturing, quantum computing could be used to develop new chemicals, polymers, and alloys. Industrial manufacturing still struggles to duplicate many materials with astonishing properties that exist in nature, such as spider silk. By weight, spider silk is comparable with steel when it comes to tensile strength, but silk is not forged in a furnace. Because spider silk is a protein encoded by DNA, quantum computing's superior ability to model matter at the subatomic level may unlock the ability to manufacture similar materials in an eco-friendly way, Morello says: "Quantum computing is a truly disruptive technology that can have gigantic value for science, for industry and for society. It's such a genuinely transformational technology that the vast majority of its applications will be things we haven't even thought of yet – quantum computing will help open up new worlds."
Quantum computing is a theoretical computing model that uses a very different form of data handling to perform calculations. The emergence of quantum computing is based on a new kind of data unit that could be called non-binary, as it has more than two possible values. A traditional computer works on bits of data that are binary, or Boolean, with only two possible values: 0 or 1. In contrast, a quantum bit, or "qubit," has possible values of 1, 0, or a superposition of 1 and 0 (in the case of an as-yet-unmeasured value). According to scientists, qubits are based on physical atoms and molecular structures. However, many find it helpful to think of a qubit as a binary data unit with superposition.

Quantum Computing Fundamentals

All computing systems rely on a fundamental ability to store and manipulate information. Current computers manipulate individual bits, which store information as binary 0 and 1 states. Quantum computers leverage quantum mechanical phenomena to manipulate information. To do this, they rely on quantum bits, or qubits. Three quantum mechanical properties — superposition, entanglement, and interference — are used in quantum computing to manipulate the state of a qubit.

Superposition: Superposition refers to a combination of states we would ordinarily describe independently. To make a classical analogy, if you play two musical notes at once, what you will hear is a superposition of the two notes.

Entanglement: Entanglement is a famously counter-intuitive quantum phenomenon describing behavior we never see in the classical world. Entangled particles behave together as a system in ways that cannot be explained using classical logic.

Interference: Finally, quantum states can undergo interference due to a phenomenon known as phase. Quantum interference can be understood similarly to wave interference; when two waves are in phase, their amplitudes add, and when they are out of phase, their amplitudes cancel.

Quantum Computing Models

There are a number of quantum computing models, distinguished by the basic elements into which the computation is decomposed. The four main models of practical importance are:
- Quantum gate array (computation decomposed into a sequence of few-qubit quantum gates)
- One-way quantum computer (computation decomposed into a sequence of one-qubit measurements applied to a highly entangled initial state, or cluster state)
- Adiabatic quantum computer, based on quantum annealing (computation decomposed into a slow continuous transformation of an initial Hamiltonian into a final Hamiltonian, whose ground states contain the solution)
- Topological quantum computer (computation decomposed into the braiding of anyons in a 2D lattice)

The quantum Turing machine is theoretically important, but direct implementation of this model is not pursued. All four models of computation have been shown to be equivalent; each can simulate the others with no more than polynomial overhead.

Quantum Computers vs. Conventional Computers

Although people often assume that quantum computers must automatically be better than conventional ones, that's by no means certain. So far, just about the only thing we know for certain that a quantum computer could do better than a normal one is factorization: finding two unknown prime numbers that, when multiplied together, give a third, known number.
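The asymmetry at the heart of that factorization claim is easy to demonstrate: multiplying two primes is instant, while recovering them by brute force takes visibly longer and scales badly. The primes below were chosen by us purely for illustration:

```python
import time
from math import isqrt

def trial_division(n):
    # Brute-force search for the smallest prime factor of n.
    for d in range(2, isqrt(n) + 1):
        if n % d == 0:
            return d
    return n

p, q = 999_983, 1_000_003          # two roughly six-digit primes
start = time.perf_counter()
n = p * q                          # multiplying: effectively instant
print(f"multiply: {time.perf_counter() - start:.6f}s")

start = time.perf_counter()
print(trial_division(n))           # 999983, recovered the slow way
print(f"factor:   {time.perf_counter() - start:.6f}s")
# Every extra digit multiplies the factoring work; RSA moduli are
# hundreds of digits long, far beyond any brute-force search.
```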
In 1994, while working at Bell Laboratories, mathematician Peter Shor demonstrated an algorithm that a quantum computer could follow to find the "prime factors" of a large number, which would speed the problem up enormously. Shor's algorithm really excited interest in quantum computing because virtually every modern computer (and every secure online shopping and banking website) uses public-key encryption technology based on the virtual impossibility of finding prime factors quickly (it is, in other words, essentially an "intractable" computer problem). If quantum computers could indeed factor large numbers quickly, today's online security could be rendered obsolete at a stroke. But what goes around comes around, and some researchers believe quantum technology will lead to much stronger forms of encryption. (In 2017, Chinese researchers demonstrated for the first time how quantum encryption could be used to make a very secure video call from Beijing to Vienna.)
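The core of Shor's method — finding the period of a^x mod n and turning it into factors — can be emulated classically for tiny numbers. A quantum computer finds the period exponentially faster; this sketch (our labels, standard number theory) does it by brute force instead:

```python
from math import gcd

def shor_classical(n, a=2):
    # Emulate the quantum heart of Shor's algorithm by brute force:
    # find the period r of f(x) = a^x mod n, then derive a factor.
    if gcd(a, n) != 1:
        return gcd(a, n)           # lucky guess: a shares a factor with n
    r = 1
    while pow(a, r, n) != 1:
        r += 1
    if r % 2:
        return None                # odd period: retry with a different a
    x = pow(a, r // 2, n)
    f = gcd(x - 1, n)
    return f if f not in (1, n) else None

print(shor_classical(15))  # 3  (15 = 3 x 5)
print(shor_classical(21))  # 7  (21 = 3 x 7)
```

The quantum speedup lies entirely in the period-finding loop: a quantum Fourier transform extracts r in polynomial time, while the classical loop above grows exponentially with the size of n.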
Does that mean quantum computers are better than conventional ones? Not exactly. Apart from Shor's algorithm, and a search method called Grover's algorithm, hardly any other algorithms have been discovered that would be better performed by quantum methods. Given enough time and computing power, conventional computers should still be able to solve any problem that quantum computers could solve, eventually. In other words, it remains to be proven that quantum computers are generally superior to conventional ones, especially given the difficulties of actually building them. Who knows how conventional computers might advance in the next 50 years, potentially making the idea of quantum computers irrelevant—and even absurd.

History of Quantum Computing

Quantum computing tends to trace its roots back to a 1959 speech by Richard P. Feynman in which he spoke about the effects of miniaturization, including the idea of exploiting quantum effects to create more powerful computers. This speech is also generally considered the starting point of nanotechnology. Of course, before the quantum effects of computing could be realized, scientists and engineers had to more fully develop the technology of traditional computers. This is why, for many years, there was little direct progress, or even interest, in the idea of making Feynman's suggestions into reality.

In 1985, the idea of "quantum logic gates" was put forth by the University of Oxford's David Deutsch as a means of harnessing the quantum realm inside a computer. In fact, Deutsch's paper on the subject showed that any physical process could be modeled by a quantum computer. Nearly a decade later, in 1994, AT&T's Peter Shor devised an algorithm that could perform basic factorizations with as few as six qubits — with more qubits needed, of course, as the numbers requiring factorization grow more complex.

A handful of quantum computers has been built. The first, a 2-qubit quantum computer built in 1998, could perform trivial calculations before losing coherence after a few nanoseconds. In 2000, teams successfully built both a 4-qubit and a 7-qubit quantum computer. Research on the subject is still very active, although some physicists and engineers express concerns over the difficulties involved in upscaling these experiments to full-scale computing systems. Still, the success of these initial steps does show that the fundamental theory is sound.

Applications of Quantum Computing

Quantum computing could:
- Speed up the development of drugs; improve chemical industry manufacturing; desalinate seawater; and even suck carbon dioxide out of the atmosphere to curb climate change.
- Result in the invention of room-temperature superconductors that would be impervious to power drain during electrical transmission.
- Handle problems of image and speech recognition, and provide real-time language translation.
- Greatly enhance big data processing from sensors, medical records and stock fluctuations.
- And generate many other similarly important applications not yet imaginable.

The Advantages and Disadvantages of Quantum Computing

Advantages of Quantum Computing
- Quantum computers can execute certain tasks far faster than classical computers can. Not every task, however, is done better by quantum computing than by a traditional computer.
- Because a qubit can exist in a superposition of states, quantum computing offers the prospect of exponential speedup on problems that require handling enormous numbers of combinations at once.
- A quantum computer can also perform ordinary classical calculations, much like a conventional computer.

Disadvantages of Quantum Computing
- The main disadvantage is that the technology required to implement a full-scale quantum computer is not available at present. A key reason is that the coherent quantum states on which quantum computers depend are disturbed as soon as they are affected by their environment.
- Research into this decoherence problem continues, but the effort to identify a robust solution has so far made only limited progress.

Sources:
- What is Quantum Computing? (Techopedia)
- Quantum Computing Fundamentals (IBM)
- Quantum Computing Models (Wikipedia)
- What can quantum computers do that ordinary computers can't? (Explain That Stuff)
- History of Quantum Computing (ThoughtCo)
- Possible Applications of Quantum Computing (OWDT)
- The Advantages and Disadvantages of Quantum Computing (1000projects.org)
Learn about parallel computing, the rise of heterogeneous processing (also known as hybrid processing), and the prospect of quantum engineering as a field of study!

Parallel computing used to be a way of sharing tasks between processor cores. When processor clock rates stopped increasing, the response of the microprocessor companies was to increase the number of cores on a chip to increase throughput. Since then, though, specialized processing elements have become more popular.

A GPU is a good example of this. A GPU is very different from an x86 or ARM processor and is tuned for a different type of processing. GPUs are very good at matrix math and vector math. Originally, they were designed to process pixels. They use a lot of floating-point math because the math behind how a pixel value is computed is very complex. A GPU is very useful if you have a number of identical operations that you have to calculate at the same time. GPUs used to be external daughter cards, but in the last year or two the GPU manufacturers have started to release low-power parts suitable for embedded applications. These include several traditional cores and a GPU, so now you can build embedded systems that take advantage of machine learning algorithms that would traditionally have required too much processing power and too much thermal headroom.

This is an example of a heterogeneous (or hybrid) processor. A heterogeneous processor contains cores of different types, and a software architect figures out which types of workloads are processed by which type of core. Professor Andrew Chen has predicted that this will increase in popularity because it has become difficult to take advantage of shrinking the semiconductor feature size. This year or next year, we will start to see heterogeneous processors with multiple types of cores.

Traditional processors are tuned for algorithms on integer and floating-point operations where there isn't an advantage to doing more than one thing at a time; the dependency chain is very linear. A GPU is good at doing multiple computations at the same time, so it can be useful when there aren't tight dependency chains.
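To see the data-parallel point in miniature, compare applying the same multiplication one element at a time against one batched operation. This is our illustration, with NumPy's vectorized kernel standing in for a GPU-style engine:

```python
import time
import numpy as np

# The shape of workload GPUs excel at: identical, independent
# arithmetic over millions of elements.
a = np.random.rand(2_000_000)
b = np.random.rand(2_000_000)

start = time.perf_counter()
slow = [x * y for x, y in zip(a, b)]   # one at a time, scalar-core style
print(f"scalar loop: {time.perf_counter() - start:.3f}s")

start = time.perf_counter()
fast = a * b                           # one batched op, vector/GPU style
print(f"batched op:  {time.perf_counter() - start:.3f}s")
```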
Neither processor is very good at real-time processing. If you have real-time constraints – when the latency between an ADC and the "answer" returned by the system must be short – there is a lot of computing required in a hurry, so a new type of digital hardware is needed. Right now, ASICs and FPGAs tend to fill that gap, as we've discussed in the All about ASICs podcast.

Quantum cores (like we discussed in the "what is quantum computing" podcast) are something that we could see on processor boards at some point. Dedicated quantum computers that can exceed the performance of traditional computers will likely be introduced within the next 50 years, perhaps as soon as the next 10 or 15. To be a consumer product, a quantum computer would have to be a solid-state device, but such devices are purely speculative at this point in time.

Quantum computing is reinventing how processing happens. And quantum computers are going to tackle very different types of problems than conventional computers. There is a catalog on the web of problems and algorithms that would run substantially better on a quantum computer than on a traditional computer. People are creating algorithms for computers that don't even exist yet. The Economist estimated that the total spend on quantum computing research is over $1 billion per year globally. A huge portion of that spending is generated by the promise of these algorithms and papers; the interest is driven by them.

Quantum computers will not completely replace typical processors. Lee's opinion is that the quantum computing industry is still very speculative, but the upsides are so great that neither the incumbent large computing companies nor the industrialized countries want to be left behind if it does take off. The promise of quantum computing goes beyond just the commercial industry; it's international and inter-industry. You can find long whitepapers from all sorts of different governments laying out a quantum computing research strategy. There are also a lot of venture capitalists investing in quantum computing.

Is this research and development public, or is there a lot of proprietary information out there? It's a mixture: many of the startups and companies have software components that they are open-sourcing and claim to have "bits of physics" working (quantum bits, or qubits), but they are definitely keeping trade secrets.

19:50 – Quantum communication means space lasers. Engineering with quantum effects has promise as an industry. One can send photons with entangled states. The Chinese government has a satellite that can generate these photons and send them to base stations. If anyone intercepts and reads them, the endpoints can tell, because the wave function collapses too soon. Quantum sensing promises accelerometers and gyroscopes that are orders of magnitude more sensitive than what's commercially available today.

Quantum engineering could become a new field. Much as electrical engineering was born about 140 years ago, electronics roughly 70 years ago, and computer science out of math and electrical engineering, it's possible that the birth of quantum engineering will be considered to be some point in the last 5 years or the next 5.

Lee's favorite quantum state is the Bell state. It's the two-qubit state with equal probabilities of measuring 1 and 0, among other interesting properties. The Bell state encapsulates a lot of the quantum weirdness in one snippet of math.
In quantum teleportation, the properties of quantum entanglement are used to send a spin state (qubit) between observers without physically moving the involved particle. The particles themselves are not really teleported, but the state of one particle is destroyed on one side and extracted on the other side, so the information that the state encodes is communicated. The process is not instantaneous, because information must be communicated classically between observers as part of the process.

The usefulness of quantum teleportation lies in its ability to send quantum information arbitrarily far distances without exposing quantum states to thermal decoherence from the environment or other adverse effects. Although quantum teleportation can in principle be used to actually teleport macroscopic objects (in the sense that two objects in exactly the same quantum state are identical), the number of entangled states necessary to accomplish this is well outside anything physically achievable, since maintaining such a massive number of entangled states without decohering is a difficult problem. Quantum teleportation is, however, vital to the operation of quantum computers, in which manipulation of quantum information is of paramount importance. Quantum teleportation may eventually assist in the development of a "quantum internet" that would function by transporting information between local quantum computers using quantum teleportation [1].

Below is a sketch of an algorithm for teleporting quantum information. Suppose Alice has state C, which she wants to send to Bob. To achieve this, Alice and Bob should follow this sequence of steps:

1) Generate an entangled pair of electrons with spin states A and B, in a particular Bell state:
\[
|\Phi^+\rangle_{AB} = \tfrac{1}{\sqrt{2}}\left(|\uparrow\uparrow\rangle_{AB} + |\downarrow\downarrow\rangle_{AB}\right).
\]
Separate the entangled electrons, sending A to Alice and B to Bob.

2) Alice measures the "Bell state" (described below) of A and C, entangling A and C.

3) Alice sends the result of her measurement to Bob via some classical method of communication.

4) Bob applies a correction to the spin of state B, along an axis determined by Alice's measurement.

Since step 3 involves communicating via some classical method, the information in the entangled state must respect causality. Relativity is not violated because the information cannot be communicated faster than the classical communication in step 3 can be performed, which is sub-lightspeed.

The idea of quantum teleportation, which can be seen in the mathematics below, is that Alice's measurement disentangles A and B and entangles A and C. Depending on what particular entangled state Alice sees, Bob will know exactly how B was disentangled, and can manipulate B to take the state that C had originally. Thus the state C was "teleported" from Alice to Bob, who now has a state that looks identical to how C originally looked.

It is important to note that state C is not preserved in the process: the no-cloning and no-deletion theorems of quantum mechanics prevent quantum information from being perfectly replicated or destroyed. Bob receives a state that looks like C did originally, but Alice no longer has the original state C in the end, since it is now in an entangled state with A.
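The mathematics in the next section makes this precise; as a numerical check first, here is a short, self-contained NumPy simulation of the same protocol. The register ordering (C, A, B) and the standard decomposition of the Bell measurement into a CNOT followed by a Hadamard are textbook choices of ours, not something specific to this article:

```python
import numpy as np

I = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])
H = np.array([[1., 1.], [1., -1.]]) / np.sqrt(2)
CNOT = np.array([[1., 0., 0., 0.],
                 [0., 1., 0., 0.],
                 [0., 0., 0., 1.],
                 [0., 0., 1., 0.]])

# Register order: C (state to send), A (Alice's half), B (Bob's half).
alpha, beta = 0.6, 0.8                               # any normalized pair
bell_AB = np.array([1., 0., 0., 1.]) / np.sqrt(2)    # (|00> + |11>)/sqrt2
state = np.kron(np.array([alpha, beta]), bell_AB)    # 8 joint amplitudes

# Alice's Bell measurement = CNOT (C controls A), then H on C,
# then reading out qubits C and A.
state = np.kron(CNOT, I) @ state
state = np.kron(H, np.eye(4)) @ state

rows = state.reshape(4, 2)                 # row = outcome (mC, mA); col = B
p = (np.abs(rows) ** 2).sum(axis=1)        # each outcome occurs with p = 1/4
outcome = np.random.default_rng().choice(4, p=p)
mC, mA = outcome >> 1, outcome & 1         # the two classical bits Alice sends
bob = rows[outcome] / np.sqrt(p[outcome])  # Bob's collapsed qubit

# Bob's correction, dictated by Alice's two bits.
if mA:
    bob = X @ bob
if mC:
    bob = Z @ bob
print(bob)                                 # [0.6 0.8] for every outcome
```

Run repeatedly, Bob's printed qubit equals Alice's original amplitudes for each of the four possible measurement outcomes, even though only two classical bits were transmitted.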
As a review, recall the Pauli matrices:
\[
\sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad
\sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \qquad
\sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.
\]
The spin operators along each axis are defined as \(\tfrac{\hbar}{2}\) times each of \(\sigma_x, \sigma_y, \sigma_z\) for the \(x, y, z\) axes respectively. These Pauli matrices are used to construct the Bell states, an orthonormal basis of entangled states for the tensor product space of spin-\(\tfrac{1}{2}\) particles:
\[
|\Phi^\pm\rangle = \tfrac{1}{\sqrt{2}}\left(|\uparrow\uparrow\rangle \pm |\downarrow\downarrow\rangle\right), \qquad
|\Psi^\pm\rangle = \tfrac{1}{\sqrt{2}}\left(|\uparrow\downarrow\rangle \pm |\downarrow\uparrow\rangle\right).
\]
Measurements that project tensor products of spin states onto the Bell basis are called Bell measurements.

Now, follow the algorithm sketched in the previous section. Suppose Alice starts with state C, which she wants to send Bob. State C can be written in the most general form:
\[
|\psi\rangle_C = \alpha|\uparrow\rangle_C + \beta|\downarrow\rangle_C,
\]
with \(\alpha\) and \(\beta\) normalized complex constants \((|\alpha|^2 + |\beta|^2 = 1)\).

1) Generate an entangled pair of electrons A and B in the Bell state:
\[
|\Phi^+\rangle_{AB} = \tfrac{1}{\sqrt{2}}\left(|\uparrow\uparrow\rangle_{AB} + |\downarrow\downarrow\rangle_{AB}\right).
\]
The state of the full system of three particles is therefore \(|\psi\rangle_C \otimes |\Phi^+\rangle_{AB}\). This is a product state between the entangled pair AB and the non-entangled C.

2) Alice measures the Bell state of AC, entangling A and C while disentangling B. The process of measuring the Bell state projects a non-entangled state into an entangled state, since all four Bell states are entangled. Expanding Alice's full original state, she starts with:
\[
|\psi\rangle_C \otimes |\Phi^+\rangle_{AB} = \tfrac{1}{\sqrt{2}}\left(\alpha|\uparrow\rangle_C + \beta|\downarrow\rangle_C\right)\otimes\left(|\uparrow\uparrow\rangle_{AB} + |\downarrow\downarrow\rangle_{AB}\right).
\]
Multiplying out the states and changing to the Bell basis of A and C, this state can be rewritten:
\[
\tfrac{1}{2}\Big[\,|\Phi^+\rangle_{AC}\left(\alpha|\uparrow\rangle_B + \beta|\downarrow\rangle_B\right)
+ |\Phi^-\rangle_{AC}\left(\alpha|\uparrow\rangle_B - \beta|\downarrow\rangle_B\right)
+ |\Psi^+\rangle_{AC}\left(\alpha|\downarrow\rangle_B + \beta|\uparrow\rangle_B\right)
+ |\Psi^-\rangle_{AC}\left(\alpha|\downarrow\rangle_B - \beta|\uparrow\rangle_B\right)\Big].
\]
When Alice measures the Bell state of A and C, she will find one of \(|\Phi^\pm\rangle_{AC}\) or \(|\Psi^\pm\rangle_{AC}\), each with probability \(\tfrac{1}{4}\). Whichever she measures, the state of particle B after the measurement will be, up to an overall phase, \(\sigma_i|\psi\rangle_B\), where \(\sigma_i\) is the identity for the outcome \(|\Phi^+\rangle\), \(\sigma_z\) for \(|\Phi^-\rangle\), \(\sigma_x\) for \(|\Psi^+\rangle\), and \(\sigma_y\) for \(|\Psi^-\rangle\).

3) To send Bob the state of particle C, therefore, Alice does not need to send Bob the possibly infinite amount of information contained in the coefficients \(\alpha\) and \(\beta\), which may be real numbers out to arbitrary precision. She needs only to send the label \(i\) of the Bell state of A and C, which is a maximum of two bits of information. Alice can send this information to Bob in whatever classical way she likes.

4) Bob receives the label \(i\) from Alice identifying the Bell state that she measured. After Alice's measurement, the overall state of the system is:
\[
|B_i\rangle_{AC} \otimes \sigma_i|\psi\rangle_B.
\]
Bob therefore applies \(\sigma_i\) to the disentangled state on his end. Since \(\sigma_i^2 = I\) for all \(i\), Bob is left with the overall state:
\[
|B_i\rangle_{AC} \otimes |\psi\rangle_B.
\]
Bob has therefore changed the spin state of particle B to:
\[
\alpha|\uparrow\rangle_B + \beta|\downarrow\rangle_B,
\]
which is identical to the original state of particle C that Alice wanted to send. The information in state C has been "teleported" to Bob's state: the final spin state of B looks like C's original state. Note, however, that the particles involved never change hands between observers: Alice always has A and C, and Bob always has B.

References:
[1] Pirandola, S., & Braunstein, S. "Physics: Unite to build a quantum internet." Nature. http://www.nature.com/news/physics-unite-to-build-a-quantum-internet-1.19716
The theory behind quantum computing was first laid out in the 1980s. Yet it was not until recently that practice caught up with theory, enabling the construction of the first quantum computers. An unchallenged pioneer in this technology is the Canadian company D-Wave Systems. Its clients include the CIA and the National Security Agency (NSA), many research institutes, NASA, and businesses including Google and Lockheed Martin. The European Union plans to allocate a billion euros to quantum research. Tech companies are developing their own technologies, anticipating diverse applications for the awesome computational power that can be derived from quanta, the fundamental building blocks of matter.

The evening of Moore's Law

Why is so much being spent on quantum computing? Why is it such a huge breakthrough? Today's processors are made up of billions of transistors a few nanometers in size, packed into a very small space. According to Moore's Law, the number of transistors that fit into a microprocessor doubles roughly every two years. Unfortunately, or inevitably, increases in the processing power of chips have been plateauing. We are approaching the technological limits of how many transistors can be jammed into such a small space. The borderline that cannot be crossed is a transistor the size of a single atom, with a single electron used to toggle between the states of 0 and 1.

The simplest way to demonstrate the advantages of the quantum computer is to compare it with the classical machine. The familiar device we know from our daily work relies for all its operations on basic information units called bits. These, however, can only represent two states: 0 or 1. In quantum computing, it's possible to use intermediate, non-binary states that liberate us from the bondage of 0 and 1, two opposing values. The qubit (or quantum bit), which is what the information units used by quantum devices are called, can assume the values of 0 and 1 simultaneously. In fact, qubits can assume an infinite number of states between 0 and 1, achieving what is referred to as superposition. Only when the value of a qubit is observed does it ever assume either of the two basic states: 0 or 1. This may seem like a minor difference, but a qubit remaining in superposition can perform multiple tasks at the same time. We are helped here by the operation of two fundamental laws of quantum physics. Physically, a qubit can be represented by any quantum system with two distinct basic states: two energy levels in an atom, or two levels of photon polarization, vertical or horizontal.

Therefore, while a bit in a classical computer holds one of two values (0 or 1), and two bits hold one of four possible combinations, two qubits can hold all four of those combinations at any given time, while 16 qubits may hold as many as 65,536 values simultaneously (2^16). The number of possibilities doubles for every qubit added, allowing a quantum machine to process far more data than a binary computer can, in an incredibly short time. Imagine a volume of data so big it would take millions of years to process by means of a classical computer. This would not be a problem for a quantum machine. It can process data hundreds of thousands and, ultimately, millions of times faster than machines made up of even the most sophisticated silicon components. The difference in capacity between quantum and conventional computers could theoretically amount to an astounding 1:18,000,000,000,000,000,000 times!
Such a computer could sift through and recognize objects in a giant collection of photographs. It would be perfect for big-number processing, encryption and code breaking. Or blockchain breaking.

The kiss of death for cryptocurrencies

According to some researchers, once quantum computers rise and spread, they could be used to crack the cryptographic protections responsible for the operating model and security of blockchain technology – the technology on which cryptocurrencies are based. Collectively, on January 3, 2018, cryptocurrencies were worth an estimated USD 700 billion. This certainly makes them worth fighting for.

What makes blockchain technology vulnerable to the threat of quantum computers? Blockchain architecture is protected by two types of security keys: private and public. To make a cryptocurrency transaction, the buyer shares a public key with the seller, while the latter uses a private key to acknowledge receipt. Should anyone other than the seller or buyer acquire the private key, they would gain control of the transaction. The private key can either be stolen or broken by the brute force of enormous computational power. The emergence and spread of quantum computers would render blockchain technology's algorithms useless: a holder of a quantum computer will be able to calculate the private key from the public key. This will give the code holder unfettered access to all the world's wallets holding all the world's cryptocurrencies. However, even though such a machine could crack a private key in minutes, the cost of a quantum computer will make that a very expensive operation. But $700 billion is a powerful incentive.
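The brute-force threat scales in a simple way that a toy model makes visible. Below, a "public key" is just a hash of a small private key — a stand-in of ours, not how real wallets work (they use elliptic-curve keys), but the work-doubling economics are the same:

```python
import hashlib
import secrets

KEY_BITS = 20                      # tiny keyspace so the demo finishes fast

def public_key(private_key: int) -> str:
    # Toy stand-in: the "public key" is a hash of the private key.
    # Real wallets use elliptic-curve math, but the brute-force
    # economics -- each extra bit doubles the work -- are the same.
    return hashlib.sha256(private_key.to_bytes(4, "big")).hexdigest()

private = secrets.randbelow(2 ** KEY_BITS)
public = public_key(private)

# The attacker only sees `public` and tries every possible key.
recovered = next(k for k in range(2 ** KEY_BITS) if public_key(k) == public)
print(recovered == private)        # True: 2^20 keys fall in seconds
# At 2^256, the same search outlives the universe on classical hardware,
# which is why the feared shortcut is a quantum algorithm, not brute force.
```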
Not all is lost

The easiest way to secure keys in the face of quantum computing would be to have the cryptocurrency community adopt a more sophisticated set of cryptographic standards. The technology to do so is out there. However, any modifications require the consent of the entire cryptocurrency community, with separate consents for each cryptocurrency. Considering that a recent attempt to get all users to agree to an increase in the volume of bitcoin (BTC) blocks – from 2MB to 4MB – failed miserably, reaching a consensus on upping security standards may prove equally elusive. The blockchain protocol requires 80% of currency users to approve any change. Since doubling the bandwidth and significantly accelerating transactions would benefit everyone, that would appear to be a no-brainer. And yet, as it turned out, not everyone saw it that way. On the other hand, by the time quantum computers become widely available, the cryptocurrency community may well recognize the threat and begin to see eye to eye on updating cryptographic standards. That would keep blockchain and cryptocurrency technology secure from quantum computers well into the future.

Devilishly fast but not unlimited

A quantum computer requires a control system (the equivalent of an operating system), algorithms to make quantum calculations, and proper calculation software. The development of quantum algorithms is very difficult, as they need to rely on the principles of quantum mechanics. The algorithms followed by quantum computers rely on the rules of probability. What this means is that by running the same algorithm on a quantum computer twice, one may get completely different results, as the process itself is randomized. To put it simply, to arrive at a reliable calculation with a quantum computer, one must factor in the laws of probability – a complex process indeed!

Quantum computers are suited to very specialized and specific calculations, as well as to algorithms designed to harness all their power. In other words, quantum computers will not appear on every desk or in every home. However, regardless of how much time is needed to generate a given result by means of an algorithm, we can imagine, even today, a situation in which a quantum machine, and only a quantum machine, could solve a problem that mankind desperately needs to solve.
Companies like IBM or Google have already unveiled the first quantum computers in history. This technological innovation represents an advance comparable to the arrival of the first computers in the mid-20th century. As its name indicates, quantum computing may seem like just another advancement in traditional computing, but that is not the case: it is a technology radically different from the one used in our computers today. Precisely because of this, it will take time for quantum computing to reach the home environment. That does not prevent quantum computing from having a large number of applications, which we will discover. We'll explain what quantum computing is and how it will be used in the near future.

What is quantum computing?

To understand how quantum computing works, it is helpful to remember how classical computing works. A traditional computer uses a binary system, based on the bit as the fundamental unit of information. That means that all the elements of the computer translate an electrical impulse into 1, if the voltage is high, or 0, if it is low or null. This system makes it possible to represent numbers and perform different logical operations with them. However, it has a fundamental limitation: the numbers do not change by themselves; each of them must be deliberately changed by a mathematical operation, which consumes energy and time.

Quantum computing introduces a qualitative leap: the minimum unit is the qubit, which can have a value of 1, 0, or both at the same time in different proportions (for example, 60% 1 and 40% 0). This allows a great variety of intermediate states, which are achieved through processes such as superposition or entanglement. These processes make it possible to perform calculations beyond the capabilities of a classical computer.

The main advantage of quantum computers is the optimization of data processing. In fact, quantum computing will not replace classical computers but will be combined with them in a hybrid structure: the traditional computing device can send data and instructions to the quantum computer, which processes the data at high speed and returns the results. The applications of quantum programming are virtually endless. Disciplines such as chemistry, medicine, logistics, economics and agriculture will benefit from the processing and calculation of complex data at high speed. Another field in which it will become vitally important is artificial intelligence and online security: the power of a quantum computer will allow technological devices to analyze data and react to it much faster.

Origin of quantum computing

Although the first practical applications of quantum computing are very recent, they are all based on quantum physics, a theory developed over the past century. Albert Einstein and Max Planck observed that light does not propagate as a continuous wave but in discrete packets, or quanta. Subsequent quantum mechanical investigations found that these units can overlap, resulting in several physical states existing simultaneously. Although superposition made it possible to conceive of a quantum computer in the mid-20th century, another problem arose: quantum physics showed that there were intermediate states, but classical computing would always read them as bits. Using the example above, a traditional computer would read a qubit that is 60% 1 and 40% 0 and simply interpret it as 1. What allowed the development of quantum computing was entanglement.
This process enabled the discovery of Shor's algorithm and of quantum annealing, which sped up the calculation of prime factors and minimum values. Together they make a computer capable of encoding intermediate states and processing data at high speed.

Differences between conventional computing and quantum computing

We have already explored some differences between classical and quantum computing: the basic unit they use, the language derived from it, and the speed of processing. These factors lead to radical differences in application: quantum computing is capable of executing algorithms that a classical computer would take thousands of years to perform, unless it had unlimited memory. Quantum computers differ fundamentally in their operation, as well as in their construction: IBM's quantum computer is a device kept in glass and covered in cables, and it does not have conventional peripherals such as screens or keyboards. There are two reasons for this. First, current machines only process information sent to them, so they do not require an interface. Second, they work under very strict conditions: they require temperatures close to absolute zero (about −273 °C) and have superconducting components.

Progress and Challenges in Quantum Computing

Considering the difficulty of building and maintaining a quantum computer, it is clear that the widespread application of this new technology will take a few more years. However, there have already been some significant advances in quantum computing: the first quantum computer was introduced in 1998, and Shor's algorithm was run for the first time only three years later. At the beginning of this century, the company D-Wave was at the forefront of progress: in 2007 it managed to execute quantum annealing with 16 qubits, and a decade later it introduced a 2,000-qubit computer. IBM has already introduced devices capable of running other algorithms, so we can expect new milestones in the coming years.

However, a number of challenges facing quantum computing today need to be addressed first. Quantum computers have a very limited calculation time, after which the information loses its precision. This is because qubits are error-prone and lose their quantum state easily. Furthermore, the hybrid technology combining classical and quantum computing requires the development of quantum algorithms, without which it will be difficult for these advances to be applied to common devices.

In summary, it appears that quantum computing will at first be available only to companies that perform computationally expensive calculations. For example, companies such as Google and Microsoft will use it to develop machine learning or replicate biochemical processes, and security agencies will use it to decipher encrypted codes and increase security. Ordinary users will need to wait before seeing any results in their homes, but the exponential growth of quantum computing is very promising.
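The hybrid structure described above — a classical host steering a fast quantum co-processor — can be sketched as a feedback loop. In this mock-up, quantum_evaluate is an invented stand-in for a quantum device returning noisy measurement averages, not any vendor's real API:

```python
import random

def quantum_evaluate(theta: float) -> float:
    # Invented stand-in for a quantum co-processor: it returns a noisy
    # estimate of some cost landscape, like averaged shot measurements.
    return (theta - 1.3) ** 2 + random.gauss(0, 0.01)

theta, step = 0.0, 0.1
for _ in range(200):
    # The classical host estimates a gradient from two "quantum" calls
    # and updates the parameters it will send next -- the hybrid loop.
    grad = (quantum_evaluate(theta + 0.01)
            - quantum_evaluate(theta - 0.01)) / 0.02
    theta -= step * grad
print(round(theta, 1))   # settles near 1.3, the landscape's minimum
```

This division of labor — classical optimization wrapped around noisy quantum evaluations — is the same pattern used by variational algorithms on today's early hardware.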
By manipulating quantum structures in the Sun's atmosphere, entanglement of electricity can be achieved.

What is electricity?

Electricity is the set of physical phenomena associated with the presence and motion of matter that has a property of electric charge. Electricity is related to magnetism, both being part of the phenomenon of electromagnetism, as described by Maxwell's equations. Various common phenomena are related to electricity, including lightning, static electricity, electric heating, electric discharges and many others.

Electricity is carried by moving electrons. In the gases that make up air, these electrons are normally strongly attached to the molecules that they form. However, during a lightning strike, they're ripped away and can move about, allowing electricity to flow. This leaves behind positively charged molecules. The mix of electrons and positive molecules is called a plasma. When the negative electrons recombine with the positively charged gas to re-form stable molecules, visible light is given off. That's what you see.

Visible light is one way energy uses to get around. Light waves are the result of vibrations of electric and magnetic fields, and are thus a form of electromagnetic (EM) radiation. Visible light is just one of many types of EM radiation, and occupies a very small range of the overall electromagnetic spectrum. We can, however, directly sense light with our own eyes, thus elevating the role of this narrow window in the EM spectrum because of its significance to us.

The sun's magnetic field

Similar to our own planet, the sun is like a huge bar magnet with a north and a south pole producing a magnetic field. But the sun's magnetic field is about twice as strong as the Earth's and much, much larger, extending well beyond the farthest planet in the solar system.

The Magnetic Field

Magnetic fields are produced by moving electric charges. Everything is made up of atoms, and each atom has a nucleus made of neutrons and protons, with electrons that orbit around the nucleus. Since the orbiting electrons are tiny moving charges, a small magnetic field is created around each atom. These magnetic fields have a specific orientation or direction; this orientation is called the atom's magnetic moment. Basically, all of the atoms in an object act like several tiny magnets. In most materials, all of these moments face random directions and they all cancel each other out, so there is a net magnetization of 0, which means the object will not be a magnet. However, when all or most of these moments align in the same direction, the entire object has a net magnetization and creates a magnetic field around itself.

Hidden Portals in Earth's Magnetic Field

"We call them X-points or electron diffusion regions," explains plasma physicist Jack Scudder of the University of Iowa. "They're places where the magnetic field of Earth connects to the magnetic field of the Sun, creating an uninterrupted path leading from our own planet to the sun's atmosphere 93 million miles away."

Observations by NASA's THEMIS spacecraft and Europe's Cluster probes suggest that these magnetic portals open and close dozens of times each day. They're typically located a few tens of thousands of kilometers from Earth, where the geomagnetic field meets the onrushing solar wind. Most portals are small and short-lived; others are yawning, vast, and sustained.
Tons of energetic particles can flow through the openings, heating Earth's upper atmosphere, sparking geomagnetic storms, and igniting bright polar auroras.

THE MAGNETIC FIELD PORTALS' POSITIONS AND LOCATIONS REMAIN TOP SECRET AND CLASSIFIED

Macroscopic Quantum Energy

By manipulating the quantum state of electrons in the solar wind that interacts with Earth's magnetic field, creating portals to the Sun's atmosphere, one could create macroscopic quantum structures. These structures would use quantum entanglement to entangle electrons in the atmosphere of the Sun and connect those entangled packets of energy to Earth instantly: the energy would be teleported via quantum entanglement.

Currently, electricity flows on lines and wires and is distributed to the consumer. Meters are installed to measure the amount of energy units being used. If one of the lines carrying this flow of energy is disrupted, and it supplies you personally with power, you lose power until the line is reconnected.

The macroscopic quantum structures would be the hubs where this energy is generated. The electricity generated by the Sun's atmosphere would be converted remotely and entangled. Once entangled, the energy would be transported instantaneously to a transfer station on Earth or in space. The transfer station would then distribute the entangled electricity to larger networks globally, to be delivered to consumers directly, each with the proper device to convert this entangled electricity into flowing electricity. No wires needed.

With macroscopic quantum energy and macroscopic quantum communication, packets of information would be sent and received outside of the radio spectrum, focusing instead on the particles of interest (electrons and photons), which are easily manipulated with quantum technology. Using what Einstein referred to as "spooky action at a distance," quantum entanglement of all of our communications and energy needs could be handled securely and globally at little to no cost.

Quantum energy would benefit all of humanity. It would revolutionize the power industry and give every citizen on this planet access to free forms of energy naturally formed in our universe, in addition to communication technology. You would no longer need power plants; instead, all electricity on Earth would be generated from space with energy from our star, the Sun. The Sun releases energy at a mass–energy conversion rate of 4.26 million metric tons per second, which produces the equivalent of 384.6 septillion watts (3.846×10^26 W).

In 2019, the world primary energy consumption was 13.9 billion toe (tons of oil equivalent). With a world population of about 7.7 billion, we now have a world average consumption of primary energy of 58 kWh per day per person. WITH QUANTUM ENERGY WE COULD HARNESS ENORMOUS AMOUNTS OF ENERGY – on the order of 384,600,000,000,000,000,000,000,000 watts (3.846×10^26 W). This energy is always around us and, due to our lack of technological advancement, it has remained inaccessible to humans until now. Rather than using Earth's natural resources to produce energy on Earth and polluting Earth's atmosphere with toxins, we could theoretically use the natural resources of space and harness the powerful energy of our star using advanced macroscopic quantum structures.
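The consumption figure quoted above checks out arithmetically; here is the computation, using the standard conversion of 1 toe = 11,630 kWh:

```python
# Sanity check of the 58 kWh/day figure quoted above.
TOE_KWH = 11_630               # standard conversion: 1 toe = 11,630 kWh
world_toe = 13.9e9             # 2019 world primary energy use, per the text
population = 7.7e9

per_person_per_day = world_toe * TOE_KWH / population / 365
print(round(per_person_per_day))   # 58 -> matches the article's figure
```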
Think back a second. When was it that you got your first smartphone? What about the first time that you streamed a show online? Those things became available to us around 12-15 years ago, depending on how tech-savvy you were at the time. Now, though, smartphones and fast computers are ubiquitous. Not only that, but they're affordable. Cutting-edge technology keeps slicing deeper and deeper, to the point that we're used to rapid progress. We expect to be amazed, then we get bored of our amazement and look for the next thing.

That said, is computer processor speed just going to keep getting better? We're going to look at this question today, giving you some insights into the world of technology and where it's headed. Let's get started.

How Do Computer Processors Work?

To start this discussion, we have to know a few things about how computer processors work. A few basic insights into CPUs allow us to have a better grasp of what the future might hold. A central processing unit (CPU) is considered the brain of the computer. It's where all of the complex tasks take place, and it manages everything you do while you use a device. The CPU reaches into the random access memory and hard drive storage to get information in a matter of milliseconds. It also interacts with your graphics processing unit to generate all of the beautiful images and 3D renderings you engage with on-screen. The processor consists of millions to billions of transistors made of semiconductor materials. Simply put, a semiconductor allows or blocks electrical signals, depending on the situation.

The Importance of Transistors

As a semiconductor device, a transistor manages electrical signals by switching them on or off. When switched on, it allows the current to continue or directs it the right way. When off, the signal is stopped. It's like a little traffic cop that stops and starts traffic to keep things flowing smoothly. This little device is the absolute building block for all computers and pieces of modern technology. It might not seem like it's very complex or that it could power something as influential as the iPhone. That said, these devices are all just the result of small electrical signals getting directed to produce specific, mechanical actions.

When you press a single key on your keyboard, there's a simple and elegant process that takes place. The button sends a signal to the CPU, which then sends a signal to the screen, and the letter pops up in an instant. That process is reflective of almost any process you do on the computer. It's simple, but the complexity compounds each time you press another button. In the case of the transistor, that little traffic cop gets multiplied by orders of magnitude and placed in a microchip. The microchip is an essential worker for the CPU. A chip the size of your fingernail holds billions (yes, billions) of transistors.

Moore's Law and The Future of Technology

At some point, the only devices available had ten or twenty transistors in them. That was some time back in the sixties or seventies when computer technology took hold. The more transistors you include in a device, though, the better it is. When they're placed on a microchip, they're said to be included in an "integrated circuit." When you increase the ability of an integrated circuit to house transistors, you improve the quality of the device in question. One of the founders of Intel, Gordon Moore, proposed an idea.
He said that, so long as the price stays consistent, the integrated circuit will be able to house double the number of components every 18 to 24 months. As a result, the performance of technology will be twice as good as it was a year and a half prior. His law held up for the first twenty years of the computer. Since then, there have been years when advancement fell behind his estimate and years when it surpassed his estimate. That said, the slope of Moore's law and the slope of microprocessor ability are eerily close to one another. If nothing else, we can look to Moore's law to estimate roughly how good technology will be in the near and distant future, barring any big changes to the situation. In that case, it will keep doubling and improving ad infinitum.
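That doubling rule is easy to project. The sketch below starts from the Intel 4004's roughly 2,300 transistors in 1971 and doubles every two years — a back-of-the-envelope projection of ours, not a datasheet:

```python
# Back-of-the-envelope Moore's-law projection: double every ~2 years.
count_1971 = 2_300                     # Intel 4004, a common baseline
for year in range(1971, 2031, 10):
    doublings = (year - 1971) / 2
    print(year, f"{count_1971 * 2 ** doublings:,.0f} transistors")
# By 2021 this predicts ~77 billion transistors on a chip -- the same
# "tens of billions" order of magnitude as real flagship processors.
```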
What To Expect From Computer Processor Speed

Whether or not Moore's law remains correct, we can be sure that things will improve. Provided there's no extreme climate disaster or global collapse, technology will keep advancing. Phones, computers, and other devices are essential to the lifestyles of billions of people on Earth.

There's a lot of money waiting for the individuals or companies that think up new ways to improve our lives through technology. There are also a lot of problems on planet Earth that something like quantum computing could help fix. Supply chain management, hunger, poverty, and numerous other pressing problems might be eased by more intelligent computers. So, there are more than enough carrots dangling in front of humanity to push the technology cart forward.

Whether that progress will keep doubling every couple of years, only time will tell. That said, quantum computing advancements would be a paradigm shift for the entire idea of technology. The speed of our computers today was almost unimaginable 30 years ago; things are incredibly fast and easy to use now.

Where Will It End?

If things keep scaling up at an exponential rate as they have, it's impossible to imagine what the state of technology could be. Just as people 100 years ago might faint if they saw a smartphone, we might do the same if we saw what will be possible 20 years from now.

The difference for us is that things change at an exponential rate. What once took 100 years might take only ten now. Ten years from now, it may take only one year to do what took us ten, and so on. If things keep multiplying upon themselves like that, the only question is: where does it all end? Will the singularity come and take us over? Will we merge with technology in some way? Science fiction has to take the reins from that point on.

Want to Learn More About Computer Chips?

Hopefully, our look at computer processor speed was interesting to you. There's a lot more to learn and keep track of as things move forward, though. We're here to keep you filled in. Explore our site for more ideas on technology, central processing unit insights, processor cores, and much more.
Artificial intelligence will track down gravitational lenses

2022.09.15 12:26 - Marek Pawłowski

Images of distant galaxies distorted by powerful gravitational lenses are among the most visually striking phenomena photographed by telescopes. Their automatic detection is difficult for many reasons. During an international workshop in Warsaw dedicated to machine learning, scientists from the National Centre for Nuclear Research demonstrated theoretical models and software that handle this task with high efficiency and reliability.

In the coming years, astronomers expect an influx of a huge number of photos, mainly from large-scale sky surveys. The abundant observational material gives hope for groundbreaking discoveries, but it requires the development of automated image-analysis tools capable of reliably classifying the astronomical objects captured in the photographs. The National Centre for Nuclear Research (NCBJ) has already made a contribution in this field: scientists from Świerk have created and tested a set of models built on neural networks and trained to detect strong gravitational lenses. The achievement was presented, among other venues, at WMLQ 2022 (the International Workshop on Machine Learning and Quantum Computing Applications in Medicine and Physics), co-organized by NCBJ and dedicated to improving machine learning methods and their applications in physics and medicine.

"Strong gravitational lensing is so difficult to see that until five years ago, we only knew a few hundred cases in the entire cosmos," says PhD student Hareesh Thuruthipilly (NCBJ), first author of a paper in the journal Astronomy & Astrophysics. "Photographs from the sky surveys that are just beginning should increase this number to hundreds of thousands within a decade. However, there is a condition: tools that will optimize the work of astronomers must be developed. We demonstrate that our theoretical models and software are already capable of reliably detecting candidates for strong gravitational lenses."

Gravitational lenses are a consequence of general relativity, in which mass is one of the physical quantities capable of curving space-time. Photon paths that would be straight in flat space-time bend around a large mass, giving the observer the impression that the light arrived from slightly different directions than it actually did. An ordinary focusing lens deflects a light ray more strongly the farther the ray is from its optical axis. A gravitational lens works the other way around: the deflection of the light beam is greater the closer the beam passes to the center of the lens. This feature causes the images of lensed objects to smear into more or less bent streaks. In the optimal arrangement, when, from the viewer's point of view, the lens focuses light rays passing it on all sides, the image of the lensed object is stretched into a circle called an Einstein ring.
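The scale of the effect can be estimated with the standard Einstein ring formula, theta_E = sqrt(4GM/c^2 * D_ls / (D_l * D_s)). The Python sketch below plugs in illustrative values (a 10^12 solar-mass galaxy halfway to a distant source); the numbers are ours, not the article's, and the flat-space distance shortcut is only a rough approximation:

import numpy as np

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
GPC = 3.086e25       # one gigaparsec in metres

M = 1e12 * M_SUN               # assumed lens mass
D_l, D_s = 1 * GPC, 2 * GPC    # assumed lens and source distances
D_ls = D_s - D_l               # crude non-cosmological shortcut

theta_E = np.sqrt(4 * G * M / c**2 * D_ls / (D_l * D_s))  # radians
print(theta_E * 206265)        # ~1.4 arcseconds: tiny, hence hard to spot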
The influence of low-mass objects on the shape of space-time is negligible. However, when a galaxy with a mass of many billions of solar masses becomes the lensing object, spectacular views can be expected. In practice, though, detection is so difficult that the first gravitationally lensed object was not spotted until 1979.

"Only one massive galaxy in ten thousand creates lensed images," says Thuruthipilly, sketching the scale of the challenge. "The shapes of these pictures are unusual, and the great distances make the images small and faint. Moreover, in the overwhelming number of cases, the orientation of the lensing galaxy and the object behind it is suboptimal, and only bits of the streak can be seen. As if those problems weren't enough, sometimes not one galaxy is involved in the lensing but several, which results in additional image distortions."

To determine the best methods for detecting gravitational lenses, the NCBJ team prepared five models built on relatively simple (convolutional) neural networks and 21 models operating on more complex networks with a self-attention mechanism. Each model was trained separately on 18,000 images of simulated gravitational lenses. Ultimately, the effectiveness of the networks was checked on one hundred thousand computer-generated photos from the Bologna Lens Challenge database, supplemented, to make the task harder, with actual photos from the Kilo Degree Survey (KiDS).

"Self-attentive neural networks did much better," says Dr. Adam Zadrożny from the NCBJ Astrophysics Department. "With just three million parameters, they achieved results comparable to those of convolutional networks with 23 million parameters. Identification of the candidates was correct in more than nine out of ten cases. The results of our work therefore suggest that when it comes to detecting strong gravitational lenses, the future belongs to self-attentive models."

Detecting a large number of strong gravitational lenses is important for testing the applicability of general relativity and for studying the evolution of the universe. Currently, the small number of known lenses does not allow them to be used to estimate the values of the most important parameters calibrating modern cosmological models. However, if automated detection methods qualitatively expand the pool of known lenses, a new source of information will open up for cosmologists.

The WMLQ 2022 workshops (International Workshop on Machine Learning and Quantum Computing Applications in Medicine and Physics) are dedicated to issues related to machine learning and quantum computing and the possibilities of their use in physics and medicine. The workshops were held in Warsaw from 13 to 16 September at the seat of the Polish Chamber of Commerce. The event was organized by the National Centre for Nuclear Research in cooperation with scientists from the Jagiellonian University and the University of Vienna. More information: https://events.ncbj.gov.pl/event/141/

"Finding strong gravitational lenses through self-attention", H. Thuruthipilly, A. Zadrożny, A. Pollo, M. Biesiada, Astronomy & Astrophysics 664, A4 (2022). DOI: https://doi.org/10.1051/0004-6361/202142463

Photo 1. Examples of strong gravitational lenses photographed by the Hubble telescope. In the lower left corner there is a clear Einstein ring. (Sources: Hubble / NASA / ESA)
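Returning to the detection method: the models in the paper are more elaborate, but their key ingredient, self-attention over image patches, fits in a few lines of PyTorch. Everything below (patch size, layer widths, the two-class lens/non-lens head) is illustrative rather than the authors' actual architecture.

import torch
import torch.nn as nn

class TinyLensClassifier(nn.Module):
    """Minimal patch-embedding + self-attention image classifier."""
    def __init__(self, img=64, patch=8, dim=64, heads=4):
        super().__init__()
        n_patches = (img // patch) ** 2
        self.embed = nn.Conv2d(1, dim, kernel_size=patch, stride=patch)
        self.pos = nn.Parameter(torch.zeros(1, n_patches, dim))
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)
        self.head = nn.Linear(dim, 2)  # lens / not-a-lens

    def forward(self, x):
        # Split the image into patches, embed them, add positions.
        tokens = self.embed(x).flatten(2).transpose(1, 2) + self.pos
        # Let every patch attend to every other patch.
        attended, _ = self.attn(tokens, tokens, tokens)
        tokens = self.norm(tokens + attended)
        return self.head(tokens.mean(dim=1))

model = TinyLensClassifier()
logits = model(torch.randn(4, 1, 64, 64))  # a batch of 4 sky cutouts
print(logits.shape)                        # torch.Size([4, 2])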
Quantum computers can perform certain kinds of optimization problems much faster than classical computers. One example is finding the ground state of a quantum system, which can be used to optimize the performance of a quantum device.

A quantum computer is a computer that uses quantum mechanics to store and process information. The basic principle behind quantum computing is that a quantum bit (qubit) can represent a zero and a one at the same time, and that quantum computers can exploit this fact to solve certain problems much faster than classical computers.

In recent years, there has been a lot of interest in using quantum computers for optimization problems. Optimization problems are problems where we are trying to find the best possible solution from a set of possible solutions. For example, we might want to find the shortest route between two cities, or the cheapest way to make a given product.

Many different algorithms have been developed for quantum computers to solve optimization problems. For now, let's focus on one in particular, the quantum approximate optimization algorithm (QAOA).

The QAOA is a quantum algorithm that can be used to find the minimum of a function. It works by first preparing a special state called a superposition, which is a combination of all the possible solutions to the problem. The algorithm then uses a series of quantum operations to single out the solution with the lowest energy. The QAOA has been applied to a number of different optimization problems, including the traveling salesman problem and the knapsack problem.

In this example, we will use the QAOA to solve a simple optimization problem called the maximum cut (Maxcut) problem. The maximum cut problem asks us, given a graph, to split its vertices into two groups so that as many edges as possible run between the groups; those crossing edges are the ones that get "cut." In a small example graph, a well-chosen split of the vertices might place four edges between the two groups, and if no split does better, the maximum cut is four.

The maximum cut problem is difficult to solve exactly, but it can be attacked with the QAOA. QAOA is a heuristic algorithm, meaning that it is not guaranteed to find the optimal solution, but it can often find very good solutions. QAOA works by encoding the problem into a quantum state and then using a series of unitary operations to evolve the state. The final state is then measured, and the result is the proposed solution to the problem.

There are a few different ways to encode the problem into a quantum state. One common way is to use a Hamiltonian that encodes the constraints of the problem. For example, if the problem is to find the shortest path between two points, the Hamiltonian would encode the constraint that the path must have a certain length.

Once the Hamiltonian is encoded, the QAOA algorithm proceeds in two steps. In the first step, called the "preparation step", a unitary operation is applied to the state that creates a superposition of all the possible solutions. In the second step, called the "evolution step", a series of unitary operations are applied that depend on the Hamiltonian. These operations cause the state to evolve in such a way that the solutions "closer" to the optimum become more likely to be measured. Finally, the state is measured, and the result is the proposed solution to the problem.
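Before any quantum hardware enters the picture, it helps to see what the brute-force classical approach looks like. The sketch below (Python, with a made-up five-edge graph) enumerates every way of splitting the vertices into two groups, which is exactly the exponential search QAOA tries to shortcut:

from itertools import product

# Edges of a small example graph (invented for illustration).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n_vertices = 4

best_value, best_split = -1, None
for bits in product([0, 1], repeat=n_vertices):   # 2^n possible splits
    cut = sum(1 for u, v in edges if bits[u] != bits[v])
    if cut > best_value:
        best_value, best_split = cut, bits

print(best_value, best_split)   # 4 edges cut, e.g. by the split (0, 1, 0, 1)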
Maxcut is a problem in graph theory that seeks the division of a graph's vertices that cuts the maximum number of edges. It is an NP-hard problem, meaning that it is believed to be computationally intractable to solve exactly. However, recent advances in quantum computing have led to algorithms that can produce good approximate solutions to Maxcut on a quantum computer in polynomial time.

The algorithm we will use to solve Maxcut on a quantum computer is the Quantum Approximate Optimization Algorithm (QAOA). QAOA is a variational algorithm that uses a quantum computer to find an approximate solution to an optimization problem.

In order to use QAOA to solve Maxcut, we first need to encode the graph into a quantum state. This can be done using the well-known Ising model, which describes a system of spins that interact with each other via the Ising interaction. In our case, the spins will represent the vertices of the graph, and the Ising interactions will represent the edges. We can then use QAOA to find the maximum number of edges that can be cut, by searching for the configuration of spins that minimizes the Ising energy.

It should be noted that QAOA is not a perfect algorithm, and it will not always find the optimal solution to the Maxcut problem. However, it is a very powerful algorithm that can find very good solutions to Maxcut in polynomial time.
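Here is a hedged sketch of that Ising encoding in Python, reusing the small invented graph from above. Each vertex gets a spin s_i = ±1, each edge contributes s_i * s_j to the energy, and minimizing the energy is exactly maximizing the cut (QAOA would search for this minimum with a quantum circuit instead of by enumeration):

from itertools import product

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n_vertices = 4

def ising_energy(spins):
    # Each uncut edge (equal spins) contributes +1, each cut edge -1.
    return sum(spins[u] * spins[v] for u, v in edges)

best = min(product([-1, 1], repeat=n_vertices), key=ising_energy)
energy = ising_energy(best)
cut_size = (len(edges) - energy) // 2   # energy = (#uncut) - (#cut)
print(best, cut_size)                   # the minimum-energy spins cut 4 edges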
In general, an optimization problem can be expressed as follows:

minimize f(x) subject to g_i(x) = 0 for i = 1, 2, ..., m, and h_j(x) <= 0 for j = 1, 2, ..., p,

where x is the decision vector to be optimized, f(x) is the objective function, and g_i(x) and h_j(x) are the constraint functions.

The quantum approach is to encode the objective function and the constraint functions into a quantum state, and then use a quantum circuit to search for the optimal solution through quantum interference. The quantum state encoding the objective and constraint functions is usually written as

|ψ⟩ = a|x⟩ + Σ_i b_i |g_i(x)⟩ + Σ_j c_j |h_j(x)⟩,

where the superposition |x⟩ over all potential solutions is the leading term to be optimized, and |g_i(x)⟩ and |h_j(x)⟩ are superpositions over solutions that violate the constraints. The coefficient a is set so that all solutions are encoded in the quantum state, and the coefficients b_i and c_j can be set to 0 or 1.

In the quantum optimization algorithm, the state is evolved by a quantum circuit composed of a unitary operator U and a measurement operator M:

U|ψ⟩ → |ψ′⟩ = U|ψ⟩, then M|ψ′⟩ → |ψ″⟩ = M|ψ′⟩.

Here U is a unitary operator composed of many basic gates, and M is a general measurement operator. The unitary operator U can be expressed as

U = e^(−iαXΔt) e^(−iβZΔt) e^(−iγYΔt),

where X, Y, and Z are the Pauli operators, with the X and Y terms corresponding to the constraint functions g_i and h_j. The quantum state after evolution by the circuit is

|ψ′⟩ = a|x⟩ + b e^(−iΔt)(β|0⟩ + γ|1⟩) + c e^(−iΔt)(β|1⟩ − γ|0⟩).

From this expression we can see that the leading term |x⟩ is evolved by the unitary operator U alone, while all other terms are evolved by U multiplied by an additional phase factor. The different phases of the terms cause interference between them, and the terms that are less beneficial to the optimization are partially canceled out. Therefore, after the state is evolved by the quantum circuit, the probability of measuring a given solution x is proportional to the objective function:

P(x) = |⟨x|ψ′⟩|² ∝ f(x).

The objective function can thus be optimized by repeatedly preparing, evolving, and measuring the quantum state |ψ′⟩.

The same machinery applies to optimization tasks within quantum computing itself, for example:

- Optimizing the layout of a quantum circuit
- Minimizing the number of quantum gates in a quantum circuit
- Minimizing the number of quantum operations in a quantum algorithm
- Reducing the error rate of a quantum computer
- Improving the fidelity of a quantum state
- Optimizing the control of a quantum system

Quantum computers can be used to optimize the schedule of a production line, which can lead to a significant reduction in manufacturing costs. They can be used to optimize the management of supply chains, leading to lower inventory levels and improved customer satisfaction. They can also be used to optimize quality-control procedures, significantly reducing product defects.

If you're looking for a way to solve optimization problems on a quantum computer, look no further. Our quantum computer can help you solve optimization problems quickly and efficiently. Contact us today to learn more.
There are many different models that we can use to describe how particles interact with each other in the quantum world. We can also refer to these models as systems. A system is a set of parts that form a complex whole and has order to it.

One of these systems is a two-level or two-state system, sometimes abbreviated as a TLS. A simple way of picturing this type of system is a coin. A coin is a single object with two sides to it. In the quantum world, the two sides of the coin would be two possible quantum states. A quantum state is a state of a quantized system described by a set of quantum numbers, and a quantum number is a number expressing the value of some property of a particle that comes in quanta.

There are several examples of these systems in the quantum world.

Spin. Spin is one of the four basic quantum numbers. It is the intrinsic angular momentum assigned to a particle. For a two-level system, spin can be pictured as counterclockwise or clockwise, with a value of either +1/2 or −1/2. There is a special name given to this type of particle: fermions. Fermions obey the Pauli exclusion principle, which means that no two identical fermions can occupy the same quantum state. Think about the coin: it has a head on one side and a building (or an eagle) on the other. No coin carries the same image on both faces. It is the same with spin as a two-level system: one particle has a −1/2 spin while the other has a +1/2 spin. Protons, neutrons, electrons, neutrinos, and quarks are all fermions.

The transition of an atom between an excited state and a ground state. Because photons are involved, this can be classified as a two-level quantum system and called an "atom-light" interaction. Using the coin, you have the excited state on one side and the ground state on the other. The excited state is where the atom jumps to when energy is added; the ground state is the lowest energy level of the atom.

There are two processes that happen between the ground state and the excited state: absorption and emission. Absorption happens when the atom absorbs a photon, which causes the atom to become excited. Emission happens when the atom falls to the ground state and releases a photon. There are actually two types of emission, stimulated and spontaneous. An example of spontaneous emission would be radioactive decay; an example of stimulated emission is a laser. The difference between the two is that stimulated emission requires an induced electromagnetic field: a field has to be introduced to the system to cause the emission. Spontaneous emission, on the other hand, occurs naturally. With our coin, we can imagine that the coin has been forced to spin or is endlessly flipping; this action demonstrates how absorption and emission are constantly occurring.
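Before moving on to more examples, here is a minimal numerical sketch (Python/NumPy, ours rather than the article's) of how such a two-level "coin" is written down: the two faces are basis vectors, the spin observable is a 2-by-2 matrix, and its eigenvalues are exactly the ±1/2 described above.

import numpy as np

up = np.array([1, 0])     # heads -> spin up
down = np.array([0, 1])   # tails -> spin down

# Spin observable along z (in units of hbar): eigenvalues +1/2 and -1/2.
Sz = 0.5 * np.array([[1.0, 0.0], [0.0, -1.0]])
print(np.linalg.eigvalsh(Sz))   # [-0.5  0.5]

# A coin "mid-flip": an equal mix of both faces, 50/50 odds when measured.
coin = (up + down) / np.sqrt(2)
print(np.abs(coin) ** 2)        # [0.5 0.5]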
The ammonia molecule. The nitrogen atom of ammonia has two molecular states, "up" and "down." Once again, on one side of the coin you have "up" and on the other you have "down." These two states are non-degenerate: they do not sit at the same quantum energy level. When the molecule is excited, vibration is caused by the absorption and re-emission of photons. This is similar to tossing a slinky back and forth in your hand. This quantum phenomenon gives the ammonia molecule its pyramidal shape and allows ammonia to be used as the source for a special type of laser called a maser. MASER stands for Microwave Amplification by Stimulated Emission of Radiation.

The qubit. The qubit is used in quantum computing. Like the bit used in ordinary computing, the qubit is the unit of quantum information. Unlike the bit, the qubit can hold a 0 and a 1 at the same time. A common example of the two states of a qubit is polarization. On one side of the coin you have the value 0 and, say, horizontal polarization; on the other side you have the value 1 and vertical polarization.

The qubit reveals an interesting property of our quantum coin, called superposition: two states existing at the same time. A related but distinct property is entanglement, in which two or more particles share collective properties. In this case, the shared property is polarization, vertical and horizontal.

The doublet. Doublets are spectral lines of an ionized gas that have been split into two lines under the influence of a magnetic field. The doublet would have +1/2 on one side of the coin and −1/2 on the other. The doublet reveals another feature of our quantum coin: rotational symmetry. Regardless of how you rotate the coin, the magnitude of the value is still 1/2.

The concept of the two-level or two-state quantum system is being studied ever more closely as researchers refine the idea of quantum computing. The qubit is only one such system, but the others have helped researchers understand how to manipulate and develop it.
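The entanglement mentioned above can also be written down concretely. In this NumPy sketch (our illustration, not from the article), two qubits share a Bell state: each measurement individually looks random, but the two results always agree, which is the "shared property" described in the text.

import numpy as np

# Two-qubit amplitudes in the basis order |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)   # (|00> + |11>) / sqrt(2)

probs = np.abs(bell) ** 2
print(probs)   # [0.5 0.  0.  0.5]: only 00 or 11 ever occurs

# Simulated measurements: the two qubits always give the same value.
rng = np.random.default_rng(seed=1)
print(rng.choice(["00", "01", "10", "11"], p=probs, size=10))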
Quantum computers can lead to breakthroughs in a wide variety of subject areas because they offer a computational strength we've never seen before. However, not all problems are favorable for a quantum computer. In order to identify which problems make good candidates, it's important to understand how a quantum computer solves problems.

While quantum computers can offer an exponential boost in computational power, they can't be programmed in the same way as a classical computer. The instruction set and algorithms change, and the resulting output is different as well.

On a classical computer, the solution is found by checking possibilities one at a time. Depending upon the problem, this can take too long. A quantum computer can explore all possibilities at the same time, but there are a few challenges. Getting the right answer out of the computer isn't easy, and because the answers are probabilistic, you may need to do extra work to uncover the desired answer.

For example, assume you wanted to page-rank the internet. To do so, the process would require loading every single page as input data. On a classical machine you would create a computation that gives you the page rank of each page, but this takes time and a significant amount of hardware. With a quantum computer, the computation is exponentially faster than on classical hardware, but with a caveat: the result will typically be the page rank of just one page. You'd have to load the whole web again to get another, and again to get another, continuing until you eventually have the page rank for the entire internet. Because you have to load everything each time, the exponential speedup is lost. This example is not favorable for quantum computing.

To solve any problem, you'll have input, computation, and output.

- Input – The data required to run the computation
- Computation – The instructions given to the computer to process the data
- Output – The useful result received from the computation

Instead of returning the entire quantum state, a quantum computer returns one state as the result of a computation. This unique characteristic is why we write quantum algorithms in a way that produces the desired answer with the highest probability. For this reason, problems that require only a limited number of output values are more applicable.

The amount of input data is also a consideration. As input data increases, either the number of qubits or the amount of work to "prepare" the data grows quickly. Problems with highly compressed input data are much more favorable.

What types of problems are ideal challenges for a quantum computer?

Quantum computers are best suited for solving problems with a limited volume of output and, ideally, a limited amount of input. These restrictions might lead you to assume that the scope of what quantum computers can do is narrow, but the exact opposite is true. Quantum computers provide a level of computational power that allows us to tackle some of the biggest challenges we face. The nuance is in framing problems in a way that makes them solvable. Here are some great examples of how a quantum computer can be used to address some of today's biggest challenges.

Modelling molecules is a perfect application for quantum computing.
In Richard Feynman's own words, "Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly it's a wonderful problem, because it doesn't look so easy."

While we have an accurate understanding of organic molecules (those with s and p orbitals), molecules whose orbitals interact with each other are currently beyond our ability to model accurately. Many of the answers we need to address significant issues, such as world hunger and global warming, come by way of understanding these more difficult molecules. Current technology doesn't allow us to analyze some of the more complex molecules; however, this is an excellent problem for a quantum computer because the input and output are small. There's a unique approach in quantum computing where, instead of loading the input data, you encode it into the quantum circuit itself. Modelling molecules is an example of this: the initial positions of the electrons would be the input (also referred to as "preparation"), and the final positions of the electrons would be the output.

Modelling materials is essentially in the same problem class as modelling molecules, which means quantum computers are also helpful in identifying new possibilities in materials science. The ability to develop high-temperature superconductors is a great example. We currently lose around 15% of the power in the energy grid every year due to the resistance in the wires transporting the electricity. Finding a material that can transmit energy without heating up the wires requires modelling the properties of materials, a process very similar to modelling molecules. Again, this precise focus has a minimal amount of input and a highly focused output, both hallmarks of a good candidate for quantum computing. In addition, materials have a regular structure with (mostly) local interactions, making them generally easier to model than chemicals on a quantum computer.

Many cryptosystems are built using math problems more difficult than a classical computer is able to solve. However, a quantum computer has the computational ability to find solutions to the cryptographic algorithms in use today. Cryptographic problems that use factoring are excellent examples of problems that can be solved with a quantum computer, because both the input and output are each a single number. Note that the numbers used in the key are huge, so a significant number of qubits is needed to calculate the result. A quantum computer's ability to solve cryptographic algorithms is an issue we take extremely seriously at Microsoft, and we are already working on quantum-safe cryptography protocols to replace those which will be vulnerable to quantum attacks.

Machine learning and optimization

In general, quantum computers aren't challenged by the amount of computation needed. Instead, the challenge is getting a limited number of answers and restricting the size of the inputs. Because of this, machine learning problems often don't make for a perfect fit, given their large amount of input data. However, optimization problems, a type of machine learning problem, can be a good fit for a quantum computer.

Imagine you have a large factory and the goal is to maximize output. To do so, each individual process would need to be optimized on its own, as well as compared against the whole. Here the possible configurations of all the processes that need to be considered are exponentially more numerous than the input data that describes them, as the quick sketch below illustrates.
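A back-of-the-envelope calculation (Python, with invented numbers) shows how fast that gap opens up:

# Ten factory processes, each with only four possible settings.
processes, settings = 10, 4

input_size = processes * settings      # numbers needed to describe the problem
search_space = settings ** processes   # configurations that must be compared

print(input_size)     # 40
print(search_space)   # 1048576, and it multiplies by 4 per added process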
With a search space exponentially bigger than the input data, optimization problems are feasible for a quantum computer.

Additionally, due to the unique requirements of quantum programming, one of the unexpected benefits of developing quantum algorithms is identifying new methods to solve problems. In many cases, these new methods can be brought back to classical computing, yielding significant improvements. Implementing these new techniques in the cloud is what we refer to as quantum-inspired algorithms.

Quantum computing brings about a paradigm shift in multiple ways. Not only will quantum computing provide access to new levels of computational ability, but it will also inspire new ways of thinking. For a quantum computer to solve some of our biggest challenges, we have to understand how to frame the problem. As we look at problems in new ways, this shift can, in turn, bring new ideas to how we approach classical computation as well. With more and more individuals considering problems from different angles, more and more ideas and solutions will result.

Luckily, you don't have to wait until quantum computers are readily available to begin considering problems in new ways; you can start today by learning quantum development. As you dive into the world of quantum development, you'll practice your ability to think about problems in new ways, get familiar with programming a quantum computer, and even simulate your work so that you'll be ready once quantum computers are made available. Get started today with the Microsoft Quantum Development Kit.
From the National Institute of Standards and Technology to your home: learn about cutting-edge random number generators in this hands-on lab.

What you'll learn: how to "entangle" two magnets and recreate a quantum-entangled system of random number generation.

Random numbers are difficult to create, but they are necessary to safeguard our personal information, online passwords, and electronic messages. While we can't create a truly random number generator in our living room, we can learn about the science of quantum entanglement and generate our own random numbers in a project that closely mimics NIST's. Kids will learn what entanglement means, how binary is used in quantum entanglement, and how binary numbers are converted into digits that we recognize. Add depth to this project by reading our interview with the researcher Dr. Peter Bierhorst!

A hands-on demonstration of quantum entanglement

When it comes to the world of quantum mechanics, it is difficult to find a basis of understanding in the everyday world. Why? Because the quantum world is not only very tiny, it is also very weird. For example, is a photon of light a particle? Is it a wave? Is it both at the same time?!? What happens if a light particle is whizzing around at mind-boggling speeds near the speed of light? The New World of Mr. Tompkins by George Gamow is one of my all-time favorites for thought experiments about the quantum world and a great read if your learner wants to dive deeper. In this project, we will take a look at how two entangled photons can create the ultimate random number generator.

1. Make your entangled photons.

In this project, our "photons" will be round magnets that snap together. Each magnetic pair creates one set of entangled "photons," one spin up and one spin down. This is a good analogy because magnets, too, have two states, north and south. Take two magnets that are snapped together, break them apart, and write "up" on the internal face of one magnet and "down" on the other. When the magnets are together, we should not be able to see which magnet has which state, just as with entangled photons we cannot know which photon is spin up or spin down until we look at the state of one.

What happens in the lab? Entangled photons are created by pulsing a laser at a crystal, inside which a single high-energy photon splits into two lower-energy photons that are "entangled." For example, a blue photon could be split into two red photons inside the crystal. These red photons are now bound sisters with entangled energies, momenta, and polarizations. In these experiments, we use the polarization to generate our signal of 0 or 1, because a photon can have two polarizations, up or down. As the laser shines continuously on the crystal, a steady stream of entangled photons is produced.

2. Run your experiment.

We can't put a bunch of our entangled photons in a bag, shake it up, and draw them out. Can you think why? Our entangled photons are really magnets: put a whole bunch of magnets in a bag together and they will just snap together into a long brick! To run our experiment we will put one set of entangled magnets into a small cup, shake it, and pour them out. The two magnets are identical on the outside; that is, you can't tell which is spin up and which is spin down while the pair is together (with the writing on the inside). Open the magnets face down so you can't see which magnet (or photon) is which.
Place one on your paper template; the other goes to a nearby student.

3. Keep running your experiment.

We need 4 bits of data to make the numbers 0 through 15. That means we need 4 magnets to be shaken and drawn before we can convert the spins to binary and the binary to a number. Shake an entangled set of magnets, draw them out, and place one magnet face down on each paper. Repeat this until you have filled the four spaces created for the magnets.

What happens in the lab? The two entangled photons are separated from each other using a beam splitter, an optical device that splits a beam into two parts (equal in size or not, depending on the beam splitter you choose).

4. Flip over your magnets.

Up until now, no one could have told you which of your magnets is spin up and which is spin down. Your partner will have the opposite set of magnets. Think about it: if you have an entangled magnet that is spin up, what entangled magnet does your partner have? Flip over your entangled magnets and record in the circles whether each magnet was spin up (an up arrow) or spin down (a down arrow). Below each magnet there is a line. Write either a 1, if the magnet in that position was spin up, or a 0, if it was spin down.

5. Convert to a digit.

You now have a series of 4 bits (four zeroes and ones). Use the chart on the left-hand side of the page to discover what number that binary code represents. Write this number in the box on the right.

Counting in binary: binary is defined as having two states, here a 0 or a 1. If we want to count in binary we need to do so in something called base 2, where the quantity two is written as 10. What?!? How is that possible? In base 2, each digit position represents a power of two: the first digit is worth 2^0, which is one, and the second digit is worth 2^1, which is two!

6. Re-entangle your photons.

We will need to run our experiment more times to create a string of numbers. Use the set of magnets shared between you and your partner to re-entangle the four sets of magnets, with the writing (spin up or spin down) on the inside, invisible to observers.

7. Run the experiment again.

Go through the process of shaking the magnets, then separating them face down, one on your paper, one on your partner's paper. Once you have another set of four magnets face down, you can flip them over, record the results, and convert the binary to a digit, getting yet another number. Repeat the process until you have four sets of numbers converted from binary.

8. Find your random number.

To find your random number string, transcribe the numbers you found with your entangled photons. For example, if your first experiment gave you the number 5, the second 14, the third 2, and the fourth 4, your random number would be: 51424.
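For learners who want to check their work, here is a short Python script (ours, not part of the lesson) that mirrors steps 4 through 8: each run of four spins becomes four bits, each 4-bit group becomes a number, and the numbers are strung together.

# Each experiment yields four spins; "up" counts as 1 and "down" as 0.
runs = [
    ["down", "up", "down", "up"],     # 0101 -> 5
    ["up", "up", "up", "down"],       # 1110 -> 14
    ["down", "down", "up", "down"],   # 0010 -> 2
    ["down", "up", "down", "down"],   # 0100 -> 4
]

digits = []
for spins in runs:
    bits = "".join("1" if s == "up" else "0" for s in spins)
    digits.append(str(int(bits, 2)))   # binary string -> decimal number

print("".join(digits))   # "51424", matching the worked example above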
Hitachi Cambridge Labs Tackles the Challenge of Building a Large-Scale Quantum Computer

Quantum computers promise to have a major positive impact on society. How else will we gain the capability to process exabytes of data and turn them into useful information to develop new cures for cancer, improve security, and boost artificial intelligence? However, building the hardware that will enable that paradigm change is one of the greatest technological challenges facing humanity.

In 2019, which seems like a lifetime ago due to the pandemic, Google announced that their quantum computer was the first to perform a calculation that would be practically impossible for a classical supercomputer. This is known as quantum supremacy. Its quantum computer, known as "Sycamore", carried out a specific calculation that is beyond the practical capabilities of regular, "classical" machines. Google estimates that the same calculation would take even the best classical supercomputer 10,000 years to complete. However, the problem that was solved was not a practical application; it was more a proof of concept.

Quantum computers work in a fundamentally different way from classical computers. Where a classical bit is either a 1 or a 0, a quantum bit, or qubit, can exist in multiple states at once. This capability allows for the construction of an exponentially large computing space with only a linear number of resources (qubits), making a quantum computer exponentially more powerful than conventional computing for a specific set of tasks. When qubits are inextricably linked, physicists can, in theory, exploit the interference between their wave-like quantum states to perform calculations that might otherwise take millions of years.
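A rough sketch (Python, illustrative only) of what "exponentially large computing space" means in practice: merely storing the state of n qubits on a classical machine takes 2^n complex amplitudes.

# Classical memory needed to hold an n-qubit state,
# at 16 bytes per complex amplitude.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    print(f"{n} qubits: 2^{n} amplitudes, about {amplitudes * 16:.1e} bytes")

# 10 qubits fit in kilobytes; 30 need ~17 GB; 50 would need ~18 petabytes.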
One of the drawbacks of current quantum computers is noise, because it can make qubits change state at times and in ways that programmers did not intend, leading to computational errors. Most interactions with the surrounding environment, such as charge instabilities and thermal fluctuations, are sources of qubit noise, and all of them can compromise information. The algorithms used by quantum computers must spend resources, qubits, to correct for this noise.

Current systems are still relatively simple, and as such their performance is far from what supercomputers can achieve. The first wave of development, known as noisy intermediate-scale quantum (NISQ) technology, is being led by two key technologies: ion traps and superconductors. Ion traps use single charged atoms trapped in electromagnetic fields as qubits. Superconductors are electrical resonators that can oscillate in two different manners simultaneously. Ion traps are being explored by companies like IonQ, Inc., Alpine Quantum Technologies, GmbH., and Honeywell International Inc., whereas superconductors are being worked on by International Business Machines Corporation (IBM), Google LLC, Alibaba Group Holding Limited, Intel Corporation, and Rigetti & Co, Inc. Systems using NISQ technology have been successfully demonstrated with up to a few tens of qubits working simultaneously.

The power of a quantum computer is rated by the number of qubits that it manages; Google's Sycamore had 53 working qubits. While Google was able to achieve quantum supremacy with Sycamore, the problem that was solved has little practical application. Running the quantum algorithms that could make a real impact on society would require orders of magnitude more qubits: predictions put the requirements for simulating even a simple material in the thousands of qubits, with an arbitrarily complex one demanding vastly more. Scaling up to such a large number of qubits is the greatest challenge to overcome in order to fulfill the promise of quantum computing. Ion-trap and superconducting qubits offer limited prospects for scalability beyond the NISQ era, with current qubit densities of roughly 1 and 100 qubits per square centimetre respectively. This translates into machines the size of a whole room or even a football stadium.

The Hitachi Cambridge Laboratory (HCL) is developing a new technology that has the potential to solve the scaling problem, making it a leading hardware candidate for building the first general-purpose quantum computer. (Hitachi established the Hitachi Cambridge Laboratory in collaboration with the Cavendish Laboratory of the University of Cambridge in 1985.)

HCL is using silicon transistors, the omnipresent device in all microprocessors, to make scalable qubits. One of the advantages of silicon is that it offers a comparatively quiet environment in which spins can retain their quantum nature, meaning that fewer resources will be required for error correction. The biggest attraction of silicon-based quantum processors is the ability to leverage the same technology that the microchip industry has honed for the past 60 years. Manufacturers can expect to benefit from previous multibillion-dollar infrastructure investments, keeping production costs low. Building a quantum computer on silicon means that all the clever engineering and processing that went into developing modern classical microelectronics, from dense device packaging to integrated interconnect routing, can be adapted and used to build quantum devices.

By using the same technology that is used in conventional computing, HCL aims to deliver a cost-effective, chip-size solution with an unparalleled qubit density, one that could be manufactured in large quantities in silicon foundries. The proposed solution will open up quantum computing to many new companies by transferring the successful fabless business model from the microelectronics industry to the field of quantum nanoelectronics.

At the Hitachi Cambridge Laboratory, the Quantum Information Team is tackling this challenge using complementary metal-oxide-semiconductor (CMOS) technology, the same transistor technology used in all conventional information-processing devices, such as mobile phones, computers, and cars. By using the spin of single electrons trapped in these transistors at very low temperatures, the Quantum Information Team aims to deliver a scalable solution while also reducing the cost of development.
<urn:uuid:513cac6f-fdbc-403a-84ca-d32058fccbc9>
CC-MAIN-2022-49
https://community.hitachivantara.com/blogs/hubert-yoshida/2021/04/07/hitachi-cambridge-labs-tackles-the-challenge-of-building-a-large-scale-quantum-computer?hlmlt=BL
s3://commoncrawl/crawl-data/CC-MAIN-2022-49/segments/1669446710926.23/warc/CC-MAIN-20221203075717-20221203105717-00855.warc.gz
en
0.900429
1,644
3.734375
4
Within days of each other back in 1998, two teams published the results of the first real-world quantum computations. But the first quantum computers weren't computers at all. They were biochemistry equipment, relying on the same science as MRI machines.

You might think of quantum computing as a hyped-up race between computer companies to build a powerful processing device that will make more lifelike AI, revolutionize medicine, and crack the encryption that protects our data. And indeed, the prototype quantum computers of the late 1990s indirectly led to the quantum computers built by Google and IBM. But that's not how it all began; it started with physicists tinkering with mathematics and biochemistry equipment for curiosity's sake.

"It was not motivated in any way by making better computers," Neil Gershenfeld, the director of MIT's Center for Bits and Atoms and a member of one of the two teams that first experimentally realized quantum algorithms, told me. "It was understanding whether the universe computes, and how the universe computes."

Computers are just systems that begin with an abstracted input and apply a series of instructions to it in order to produce an output. Today's computers translate inputs, instructions, and outputs into switches, called bits, that equal either zero or one and whose values control other switches. Scientists have long used computers to simulate the laws of physics, hoping to better understand how the universe works; for example, you can simulate how far a ball will go based on where it starts and how fast it is thrown. But using bits to simulate physics didn't make much sense to famed physicist Richard Feynman, since the laws of physics at the smallest scale are rooted in a set of rules called quantum mechanics. "Nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical," Feynman famously said at a 1981 conference.

A small band of scientists theorized about using these rules to create better simulations during the decade that followed. Instead of switches, their quantum simulators' bits would be the dual particle-waves of quantum mechanics. Each individual quantum bit would still be restricted to two choices, but as waves, they can take on either of these states simultaneously with varying strengths, interacting with one another like ocean waves, either amplifying the strength of certain combinations of choices or canceling combinations out. But once you measure these quantum bits, each one immediately snaps into a single state. Those strengths, or amplitudes, translate into the probability of ending up with each outcome.

Through the early 1990s, "people thought that quantum computing was essentially mad, and many had [supposedly] proved that it could never work," Jonathan Jones, a physics professor at the University of Oxford who was one of the first to run quantum algorithms on a real quantum computer, told me. Mainly, people thought it was just a curiosity created by theoretical physicists who wondered whether they could understand the universe itself in the language of computers. It also seemed that the finickiness of quantum mechanics (the fact that any slight jostle could quickly snap fragile qubits into single-state particles) would make them impossible to realize.

Two milestones busted those ideas. Physicist Peter Shor unveiled an algorithm in 1994 that showed that a computer based on qubits could factor large numbers near-exponentially faster than the best bit-based algorithms.
If scientists could invent a quantum computer advanced enough to run the algorithm, it could crack the popular modern-day encryption systems that rest on the fact that it's easy for classical computers to multiply two large prime numbers together but very, very hard to factor the result back into primes.

The second turning point came in the mid-90s, when physicists started developing error correction: the idea of spreading a single qubit's worth of information across a series of correlated qubits to lessen the errors.

But even after that, the field was small, and the physicists we spoke to described conferences at which most of the world's quantum computing scientists could fit in a single room. Quantum computing forerunners like Charlie Bennett, Isaac Chuang, Seth Lloyd, and David DiVincenzo were coming up with lots of new ideas that percolated quickly through the community. Almost simultaneously, several independent groups realized that the medical and biochemistry industries had long been using a quantum computer in research: nuclear magnetic resonance, or NMR, spectrometers.

NMR, the technology behind MRI, most commonly involves a molecule of interest dissolved in a liquid solvent and placed in a strong magnetic field. The nuclei of the atoms in these molecules have an innate quantum mechanical property called "spin," which is essentially the smallest unit of magnetic information and can be in either of two states, "up" or "down." These spins align with the direction of the field. In medicine and biochemistry, scientists hit the molecules with additional smaller oscillating magnetic fields, called radio-frequency pulses, causing the atoms to release characteristic signals that offer physical information about the molecule. Magnetic resonance imaging, or MRI, machines instead use this signal to create a picture.

But the physicists realized that they could treat certain molecules in this magnetic field as quantum computers, where the nuclei served as qubits, the spin states were qubit values, and the radio-frequency pulses were both the instructions and the controllers. These are the operations of quantum computers, also called logic gates as they are in classical computers.

"In a sense, NMR had actually been ahead of other fields for decades," said Jones, a biochemist who teamed up with physicist Michele Mosca to perform one of the first quantum calculations. "They had done logic gates back in the 70s. They just didn't know what they were doing and didn't call it logic gates."

Physicists including Chuang, Gershenfeld, and David Cory released papers detailing how to realize these devices in 1997. A year later, two teams, one with Jones and Mosca and another with Chuang and Mark Kubinec, actually performed the quantum algorithms. The former used cytosine molecules in which two hydrogen atoms had been replaced with deuterium atoms (hydrogen with a neutron). The latter used chloroform molecules. They prepared the qubits into initial states, performed a computation by applying a specially crafted radio-frequency pulse, and measured the final states.
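What such a "specially crafted" pulse does can be sketched with the textbook result for a resonant drive: a spin starting in "down" flips with probability sin^2(Ωt/2) after being driven for a time t. The Rabi frequency below is an invented value, purely for illustration.

import numpy as np

omega = 2 * np.pi * 25e3   # assumed Rabi frequency of the drive, rad/s

for t_us in (0.0, 10.0, 20.0, 40.0):
    t = t_us * 1e-6
    p_flip = np.sin(omega * t / 2) ** 2
    print(f"pulse of {t_us:5.1f} us -> flip probability {p_flip:.2f}")

# 20 us is the "pi pulse" that flips the spin with certainty;
# 40 us rotates it all the way back to where it started.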
We don't often hear about NMR quantum computers today because, even then, physicists knew that the technique had its limits, something all of the physicists I spoke with mentioned. More qubits would mean more specially crafted molecules, and the techniques relied on workarounds such that each additional qubit made it harder to pick the signal out of the background noise.

"No one thought it would ever be used for more than a demonstration," Jones said. The devices just weren't scalable beyond a few qubits.

Still, they were important experiments that physicists still talk about today. NMR machines remain crucial to biochemistry and still have a place in quantum technology. And this early work has left an important, indirect impact on the field: the science behind those radio-frequency pulses has lived on in the quantum computers that Google, IBM, and other companies have built in order to control their qubits. Quantum computers running Shor's algorithm are still decades away even today, but companies have begun unveiling real devices with dozens of qubits that can perform rudimentary and clearly quantum calculations.

Charlie Bennett, IBM fellow and quantum computing veteran, explained that these experiments weren't enormous discoveries on their own; indeed, the NMR community had been advancing its own science and radio-frequency pulses before quantum computing came along. The physicists I spoke with explained that nobody "won" and there was no "race" back in the late 1990s. Instead, it was a transition point along a road of incremental advances, a point in time at which groups of scientists all came to realize that humans had the technology to control quantum states and use them for computations.

"Science is always like that. The whole evidence is more important than almost any one paper," said Bennett. "There are important discoveries—but these rarely occur in single papers."
Walk into a quantum lab where scientists trap ions, and you'll find benchtops full of mirrors and lenses, all focusing lasers to hit an ion "trapped" in place above a chip. By using lasers to control ions, scientists have learned to harness ions as quantum bits, or qubits, the basic unit of data in a quantum computer. But this laser setup is holding research back, making it difficult to experiment with more than a few ions and to take these systems out of the lab for real use.

Now, MIT Lincoln Laboratory researchers have developed a compact way to deliver laser light to trapped ions. In a recent paper published in Nature, the researchers describe a fiber-optic block that plugs into the ion-trap chip, coupling light to optical waveguides fabricated in the chip itself. Through these waveguides, multiple wavelengths of light can be routed through the chip and released to hit the ions above it.

"It's clear to many people in the field that the conventional approach, using free-space optics such as mirrors and lenses, will only go so far," says Jeremy Sage, an author on the paper and senior staff in Lincoln Laboratory's Quantum Information and Integrated Nanosystems Group. "If the light instead is brought onto the chip, it can be directed around to the many locations where it needs to be. The integrated delivery of many wavelengths may lead to a very scalable and portable platform. We're showing for the first time that it can be done."

Computing with trapped ions requires precisely controlling each ion independently. Free-space optics have worked well when controlling a few ions in a short one-dimensional chain. But hitting a single ion among a larger or two-dimensional cluster, without hitting its neighbors, is extremely difficult. When imagining a practical quantum computer requiring thousands of ions, this task of laser control seems impractical. That looming problem led researchers to find another way.

In 2016, Lincoln Laboratory and MIT researchers demonstrated a new chip with built-in optics. They focused a red laser onto the chip, where waveguides routed the light to a grating coupler, a kind of rumble strip to stop the light and direct it up to the ion. Red light is crucial for doing a fundamental operation called a quantum gate, which the team performed in that first demonstration.

But up to six different-colored lasers are needed to do everything required for quantum computation: prepare the ion, cool it down, read out its energy state, and perform quantum gates. With this latest chip, the team has extended their proof of principle to the rest of these required wavelengths, from violet to the near-infrared. "With these wavelengths, we were able to perform the fundamental set of operations that you need to be able to control trapped ions," says John Chiaverini, also an author on the paper. The one operation they didn't perform, a two-qubit gate, was demonstrated by a team at ETH Zürich using a chip similar to the 2016 work, and is described in a paper in the same issue of Nature. "This work, paired together with ours, shows that you have all the things you need to start building larger trapped-ion arrays," Chiaverini adds.

To make the leap from one to multiple wavelengths, the team engineered a method to bond a fiber-optic block directly to the side of the chip. The block consists of four optical fibers, each one specific to a certain range of wavelengths. These fibers line up with a corresponding waveguide patterned directly onto the chip.
“Getting the fiber block array aligned to the waveguides on the chip and applying the epoxy felt like performing surgery. It was a very delicate process. We had about half a micron of tolerance and it needed to survive cooldown to 4 kelvins,” says Robert Niffenegger, who led the experiments and is first author on the paper.

On top of the waveguides sits a layer of glass. On top of the glass are metal electrodes, which produce the electric fields that hold the ion in place; holes are cut out of the metal over the grating couplers where the light is released. The entire device was fabricated in the Microelectronics Laboratory at Lincoln Laboratory.

Designing waveguides that could deliver the light to the ions with low loss, avoiding absorption or scattering, was a challenge, as loss tends to increase with bluer wavelengths. “It was a process of developing materials, patterning the waveguides, testing them, measuring performance, and trying again. We also had to make sure the materials of the waveguides worked not only with the necessary wavelengths of light, but also that they didn't interfere with the metal electrodes that trap the ion,” Sage says.

Scalable and portable

The team is now looking forward to what they can do with this fully light-integrated chip. For one, “make more,” Niffenegger says. “Tiling these chips into an array could bring together many more ions, each able to be controlled precisely, opening the door to more powerful quantum computers.”

Daniel Slichter, a physicist at the National Institute of Standards and Technology who was not involved in this research, says, “This readily scalable technology will enable complex systems with many laser beams for parallel operations, all automatically aligned and robust to vibrations and environmental conditions, and will in my view be crucial for realizing trapped-ion quantum processors with thousands of qubits.”

An advantage of this laser-integrated chip is that it's inherently resistant to vibrations. With external lasers, any vibration of the laser would cause it to miss the ion, as would any vibration of the chip. Now that the laser beams and chip are coupled together, the effects of vibrations are effectively nullified. This stability is important for the ions to sustain “coherence,” or to operate as qubits long enough to compute with them. It's also important if trapped-ion sensors are to become portable. Atomic clocks based on trapped ions, for example, could keep time much more precisely than today's standard and could be used to improve the accuracy of GPS, which relies on the synchronization of atomic clocks carried on satellites.

“We view this work as an example of bridging science and engineering that delivers a true advantage to both academia and industry,” Sage says. Bridging this gap is the goal of the MIT Center for Quantum Engineering, where Sage is a principal investigator. “We need quantum technology to be robust, deliverable, and user-friendly, for people to use who aren't PhDs in quantum physics,” Sage says.

Simultaneously, the team hopes that this device can help push academic research. “We want other research institutes to use this platform so that they can focus on other challenges — like programming and running algorithms with trapped ions on this platform, for example. We see it opening the door to further exploration of quantum physics,” Chiaverini says.
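As a back-of-the-envelope feel for why per-centimeter loss matters when routing light across a chip: propagation loss quoted in dB/cm compounds with length. A minimal sketch of the arithmetic, with illustrative numbers only (not values reported by the team):

```python
def transmitted_fraction(loss_db_per_cm: float, length_cm: float) -> float:
    """Convert propagation loss in dB/cm into the fraction of optical
    power remaining after a given on-chip routing length."""
    total_db = loss_db_per_cm * length_cm
    return 10 ** (-total_db / 10)

# Illustrative values only: bluer light often suffers higher loss.
for label, loss_db in [("blue-ish", 6.0), ("red-ish", 1.0)]:
    frac = transmitted_fraction(loss_db, 1.0)
    print(f"{label}: {frac:.0%} of power remains after 1 cm")
```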
Graphene has become an all-purpose wonder material, spurring armies of researchers to explore new possibilities for this two-dimensional lattice of pure carbon. But new research at MIT has found additional potential for the material by uncovering unexpected features that show up under some extreme conditions — features that could render graphene suitable for exotic uses such as quantum computing.

The research is published this week in the journal Nature, in a paper by professors Pablo Jarillo-Herrero and Ray Ashoori, postdocs Andrea Young and Ben Hunt, graduate student Javier Sanchez-Yamagishi, and three others. Under an extremely powerful magnetic field and at extremely low temperature, the researchers found, graphene can effectively filter electrons according to the direction of their spin, something that cannot be done by any conventional electronic system.

Under typical conditions, sheets of graphene behave as normal conductors: apply a voltage, and current flows throughout the two-dimensional flake. If you turn on a magnetic field perpendicular to the graphene flake, however, the behavior changes: current flows only along the edge, while the bulk remains insulating. Moreover, this current flows only in one direction — clockwise or counterclockwise, depending on the orientation of the magnetic field — in a phenomenon known as the quantum Hall effect.

In the new work, the researchers found that if they applied a second powerful magnetic field — this time in the same plane as the graphene flake — the material's behavior changes yet again: electrons can move around the conducting edge in either direction, with electrons that have one kind of spin moving clockwise while those with the opposite spin move counterclockwise.

“We created an unusual kind of conductor along the edge,” says Young, a Pappalardo Postdoctoral Fellow in MIT's physics department and the paper's lead author, “virtually a one-dimensional wire.” The segregation of electrons according to spin is “a normal feature of topological insulators,” he says, “but graphene is not normally a topological insulator. We're getting the same effect in a very different material system.”

What's more, by varying the magnetic field, “we can turn these edge states on and off,” Young says. That switching capability means that, in principle, “we can make circuits and transistors out of these,” he says, which has not been achieved before in conventional topological insulators.

There is another benefit of this spin selectivity, Young says: it prevents a phenomenon called “backscattering,” which could disrupt the motion of the electrons. As a result, imperfections that would ordinarily ruin the electronic properties of the material have little effect. “Even if the edges are ‘dirty,’ electrons are transmitted along this edge nearly perfectly,” he says.

Jarillo-Herrero, the Mitsui Career Development Associate Professor of Physics at MIT, says the behavior seen in these graphene flakes was predicted, but never seen before.
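For readers who want the textbook quantity behind this kind of edge conduction: in quantum Hall systems the Hall conductance is pinned to multiples of a fundamental constant. As a general reference relation, not a number reported in this article:

```latex
\sigma_{xy} = \nu \,\frac{e^{2}}{h},
\qquad
\frac{e^{2}}{h} \approx 3.87 \times 10^{-5}~\mathrm{S}
```

Here ν is the filling factor, an integer in the integer quantum Hall regime (or a rational number in the fractional regime).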
This work, he says, is the first time such spin-selective behavior has been demonstrated in a single sheet of graphene, and also the first time anyone has demonstrated the ability “to transition between these two regimes.” That could ultimately lead to a novel way of making a kind of quantum computer, Jarillo-Herrero says, something that researchers have tried to do, without success, for decades. But because of the extreme conditions required, Young says, “this would be a very specialized machine” used only for high-priority computational tasks, such as in national laboratories.

Ashoori, a professor of physics, points out that the newly discovered edge states have a number of surprising properties. For example, although gold is an exceptionally good electrical conductor, when dabs of gold are added to the edge of the graphene flakes, they cause the electrical resistance to increase. The gold dabs allow the electrons to backscatter into the oppositely traveling state by mixing the electron spins; the more gold is added, the more the resistance goes up.

This research represents “a new direction” in topological insulators, Young says. “We don't really know what it might lead to, but it opens our thinking about the kind of electrical devices we can make.”

The experiments required the use of a magnetic field with a strength of 35 tesla — “about 10 times more than in an MRI machine,” Jarillo-Herrero says — and a temperature of just 0.3 kelvin, three-tenths of a degree above absolute zero. However, the team is already pursuing ways of observing a similar effect at magnetic fields of just one tesla — comparable to a strong permanent magnet — and at higher temperatures.

Philip Kim, a professor of physics at Columbia University who was not involved in this work, says, “The authors here have beautifully demonstrated excellent quantization of the conductance,” as predicted by theory. He adds, “This is very nice work that may connect topological insulator physics to the physics of graphene with interactions. This work is a good example of how the two most popular topics in condensed matter physics are connected to each other.”

Reference: “Tunable symmetry breaking and helical edge transport in a graphene quantum spin Hall state” by A. F. Young, J. D. Sanchez-Yamagishi, B. Hunt, S. H. Choi, K. Watanabe, T. Taniguchi, R. C. Ashoori and P. Jarillo-Herrero, 22 December 2013, Nature.

The team also included MIT junior Sang Hyun Choi, and Kenji Watanabe and Takashi Taniguchi of the National Institute for Materials Science in Tsukuba, Japan. The work was supported by grants from the U.S. Department of Energy, the Gordon and Betty Moore Foundation, and the National Science Foundation, and used facilities at the National High Magnetic Field Laboratory in Florida.
For the last 100 years, quantum physics has captured the imagination of physicists worldwide with its unintuitive nature. Recently, there has been a new push by companies such as Google, Microsoft, and IBM, and by developed countries, to use these bizarre effects of quantum mechanics to build quantum-mechanical devices — most notably, a quantum computer.

In an everyday classical computer, information (say, a picture from a birthday party) is stored and processed using transistors as “bits” that can take one of two possible values: “0” (off) or “1” (on). A quantum computer takes advantage of entanglement and of massive parallelism via the superposition principle to solve problems unmanageable by classical computers. Quantum bits, or “qubits,” are different: in addition to “0” and “1,” a qubit can also exist in a superposition state. If a classical computer is tasked with finding its way out of a complex maze, it will try every single path sequentially, ruling them all out individually until it finds the right one. A quantum computer, in contrast, can go down every branch of the maze at once. For such problems, a quantum computer can cut computational time enormously, and at a lower energy cost, compared with a classical computer.

The quantum information in a quantum computer can be materialized in different physical forms and converted from one to another without changing its content. The choice of physical implementation is left to the “quantum engineer”: either natural microscopic systems, such as atoms, ions, photons, and electron or nuclear spins, or more artificial systems, such as superconducting qubits.

Superconducting qubits are promising candidates for building a quantum computer; they couple very strongly to microwave fields but exhibit coherence times of only tens of microseconds. This limitation allows only a short time window in which to perform quantum calculations before the whole system decoheres, and it has motivated researchers to look for hybrid quantum systems that increase the coherence time of superconducting qubits by coupling them to other quantum systems better protected against decoherence. Researchers are trying to couple superconducting qubits, via a superconducting resonator, to ions, atoms, or spin ensembles.

Recently, magnons have been considered as a new candidate for coherent quantum information processing. Magnons are the collective excitations of spins in magnetic materials, with frequencies ranging from GHz to THz. In comparison to paramagnetic spin ensembles, magnons can exchange information at much faster speed and for more cycles before losing coherence, while keeping the device dimensions small.

To implement high-spin-density magnetic materials in practical quantum devices, on-chip integration and miniaturization to the nanoscale are required. To achieve this goal, the following fundamental physics and technological questions must be addressed first:

1) Does the magnon-photon coupling scale as we systematically reduce the dimensions of the magnetic element into the nanoscale regime?

2) Are there critical dimensions (in length, width, or height) of magnetic elements at which magnon-photon coupling is enhanced or reduced non-linearly?

3) Can we tune the magnon-photon coupling by placing arrays of nanomagnets? Specifically, do arrays of nanomagnets on particular lattices, or of particular magnetic materials, allow better magnon-photon coupling?
4) What is the effect on the magnon-photon interaction as we vary the fundamental dipolar and exchange interactions among the nanobars?

5) Can we artificially tune the magnon-photon coupling by reprogramming magnetic arrays using a 2-D magnetic field protocol?

My goal is to address the above questions using a systematic approach that includes several state-of-the-art experimental and simulation techniques. We will miniaturize high-spin-density magnetic thin films using nanofabrication methods. These devices will be incorporated with superconducting microwave resonators to make an on-chip device. We will also construct a novel equipment package that will allow us to study the magnon-photon coupling in arrays of nanomagnets on periodic and quasicrystal lattices as a function of magnetic field, frequency, and temperature. We will utilize a two-dimensional magnetic field protocol to program the magnetic state of the nanomagnets and thereby tune the magnon-photon coupling. Furthermore, using the micro-focus Brillouin light scattering technique, we will image the spatial magnon profile. This will give an unprecedented mesoscopic understanding of the space-dependent magnon profile as the magnetic material is miniaturized toward the 100-nanometer length scale.

Our member Dr. Vinayak Bhat has received funding from SONATA BIS 10 (ST panel), financed by NCN. The project title is "Study of the effect of the nanostructured periodic and quasicrystal nanomagnet lattices on magnon-photon coupling".
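A useful rule of thumb behind question 1, and the reason high spin density matters, is the collective enhancement familiar from cavity QED and cavity spintronics: N spins coupled identically to a cavity mode exhibit an ensemble coupling enhanced by √N over the single-spin value, so shrinking a nanomagnet (fewer spins) tends to reduce the coupling, all else being equal. As a standard relation rather than a result claimed by this project:

```latex
g_{\mathrm{ens}} = g_{0}\sqrt{N}
```

Reaching the strong-coupling regime then requires this ensemble coupling to exceed both the magnon and the photon loss rates.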
Cryptocurrency is a digital form of currency secured by cryptography. Crypto transactions are controlled decentrally: the payment system doesn't rely on banks to validate transactions, and peer-to-peer technology makes it possible for anybody, anywhere, to send and receive payments. Payments made using cryptocurrencies do not exist as physical coins that can be carried around and exchanged; they exist only as digital entries in an online database that records individual transactions. A public ledger keeps track of all transactions that involve transfers of funds, and the currency itself is kept in digital wallets. Strong cryptography secures the transactions, the wallets, and the public ledger from end to end.

How does cryptocurrency work?

A distributed public ledger known as the blockchain, updated and maintained by currency holders, is the foundation of cryptocurrencies. Through a process known as mining, which employs computing power to solve challenging mathematical problems, new units of currency are created. Users can also purchase the currencies from brokers, then store and spend them using digital wallets. When you hold cryptocurrency, you don't own anything tangible; what you possess is a key that enables you to transfer a record, or a unit of measure, from one person to another without a trusted third party. Although the first cryptocurrency, Bitcoin, has been around since 2009, cryptocurrencies and blockchain technologies are still emerging, and people regularly work on these technologies to make them better financial tools.

What is Bitcoin?

Bitcoin was the first cryptocurrency. Founded in 2009 and developed by Satoshi Nakamoto, it is the most famous and most widely used. Bitcoin is a digital currency that operates independently, using cryptography and a public ledger. The public ledger records every transaction, and each node in the peer-to-peer network holds a copy of it. Anyone with a computer can join this network by setting up a node. Instead of relying on a single point of trust, such as a bank, the nodes cryptographically reach consensus on who owns which bitcoin.

Every transaction is publicly broadcast to the network and shared from node to node. Roughly every 10 minutes, miners gather these transactions into a collection called a block, which is added permanently to the blockchain — the official bitcoin account book. Virtual currencies are held in digital wallets and can be accessed using client software or a range of online and hardware tools, much as you would keep traditional money in a physical wallet.

What is the purpose of Bitcoin?

Bitcoin was developed as a means of transferring money online. The goal of the digital currency was to offer an alternative form of payment that would function without centralized control but otherwise work much like a traditional currency. Every bitcoin transaction is publicly visible and shared across the node-to-node network; every transaction is recorded, and miners add transactions to blocks that in turn form the blockchain. At its core, bitcoin is not a wallet or even a currency — it is a consensual agreement among the network. Bitcoins are controlled through private keys and passwords and can be transferred easily.

Is Bitcoin safe?
The cryptography used by bitcoin is built on the SHA-256 algorithm, designed by the US National Security Agency. Cracking it by brute force is practically impossible: more potential private keys would need to be checked than there are atoms in the universe. There have been several high-profile instances of bitcoin exchanges being hacked and funds being stolen, but in those cases it was the websites, firms holding digital currency on behalf of their users, that were breached, not the Bitcoin network itself.

A real issue is that bitcoin has no centralized control. Anyone who makes a mistake with a wallet transaction has no redress: there is no one to contact if you accidentally send bitcoins to the wrong person or forget your password. And, naturally, it might all be upended if practical quantum computing ever becomes a reality. Since quantum computers operate very differently from conventional computers, they may be able to perform some of the mathematical computations essential to current cryptography in a fraction of the time.

What is Polygon?

Polygon is a platform that supports different blockchain projects, founded by Jaynti Kanani, Anurag Arjun, Sandeep Nailwal, and Mihailo Bjelic. With the symbol MATIC, Polygon is both a cryptocurrency and a platform for connecting and expanding blockchain networks. Introduced in 2017 as Matic Network, Polygon — “Ethereum's internet of blockchains” — connects Ethereum-based projects and runs on the Ethereum blockchain. While maintaining the security, interoperability, and structural advantages of the Ethereum blockchain, the Polygon platform can boost a blockchain enterprise's flexibility, scalability, and sovereignty. MATIC is an ERC-20 token, compatible with other Ethereum-based digital currencies; it is used to govern and secure the Polygon network. Polygon also addresses the limitations of the Ethereum network, offering higher reliability and lower transaction fees.

What is the purpose of Polygon?

Polygon works on a proof-of-stake method, which establishes consensus for every block. It allows you to create custom blockchain networks, bridges communication between other blockchains and Ethereum, and helps other blockchain networks become compatible with Ethereum.

Cryptocurrencies appear to be here to stay, and they are growing rapidly; cryptography has entered many arenas, proving its security and worth. Before investing in any crypto, be it Bitcoin, Ethereum, Litecoin, Ripple, or another, do your own research rather than simply following the trends. When Bitcoin was founded, it was meant to make everyday transactions easier. Although it is still not widely used, many websites and merchants accept Bitcoin as payment, mostly technology-related sites and sellers of cars, insurance, luxury goods, and e-commerce. But be wary of fraudulent websites: scammers can steal the keys to your cryptocurrency.
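To make the hashing mechanics concrete, here is a toy sketch of how SHA-256 chains blocks together, using Python's standard hashlib. It illustrates the idea only; it is not Bitcoin's actual block format or proof-of-work.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """SHA-256 digest of a block's canonical JSON serialization."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

# Each block commits to its predecessor's hash, so altering any earlier
# block changes every hash after it.
genesis = {"index": 0, "transactions": ["alice->bob:1"], "prev_hash": "0" * 64}
block_1 = {"index": 1, "transactions": ["bob->carol:1"],
           "prev_hash": block_hash(genesis)}

print(block_hash(genesis))
print(block_hash(block_1))
```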
In the ancient world, the cubit was an important unit of measure; the new unit of the future is the qubit — the quantum bit that will change the face of computing.

Quantum bits are the basic units of information in quantum computing, a new type of computing in which particles like electrons or photons are used to process information, with their two “sides” (polarizations) acting as positive or negative (the zeros and ones of traditional computer processing) alternately or at the same time. According to experts, quantum computers will be able to create breakthroughs in many of the most complicated data processing problems, leading to the development of new medicines, the design of molecular structures, and analysis going far beyond the capabilities of today's binary computers.

The elements of quantum computing have been around for decades, but it's only in the past few years that a commercial computer that could be called “quantum” has been built, by a company called D-Wave. Announced in January, the D-Wave 2000Q can “solve larger problems than was previously possible, with faster performance, providing a big step toward production applications in optimization, cybersecurity, machine learning and sampling.”

IBM recently announced that it had gone even further — and that it expected that by the end of 2017 it would be able to commercialize quantum computing with a 50-qubit processor prototype, as well as provide online access to 20-qubit processors. IBM's announcement followed Microsoft's September announcement of a new quantum computing programming language and stable topological qubit technology that can be used to scale up the number of qubits.

Taking advantage of the physical “spin” of quantum elements, a quantum computer is able to process the same data simultaneously in different ways, enabling it to make projections and analyses much more quickly and efficiently than is now possible. There are significant physical issues that must be worked out, such as the fact that quantum computers can only operate at cryogenic temperatures (roughly 250 times colder than deep space) — but Intel, working with the Dutch firm QuTech, is convinced that it is just a matter of time before the full power of quantum computing is unleashed.

“Our quantum research has progressed to the point where our partner QuTech is simulating quantum algorithm workloads, and Intel is fabricating new qubit test chips on a regular basis in our leading-edge manufacturing facilities,” said Dr. Michael Mayberry, corporate vice president and managing director of Intel Labs. “Intel's expertise in fabrication, control electronics and architecture sets us apart and will serve us well as we venture into new computing paradigms, from neuromorphic to quantum computing.”

The difficulty of achieving a cold enough environment for a quantum computer to operate is the main reason they are still experimental and can only process a few qubits at a time — but the approach is so powerful that even these early quantum computers are shaking up the world of data processing. On the one hand, quantum computers are going to be a boon for cybersecurity, capable of processing algorithms at a speed unapproachable by any other system. By looking at problems from all directions — simultaneously — a quantum computer could discover anomalies that no other system would notice and project thousands of scenarios in which an anomaly could turn into a security risk.
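One way to see why qubit counts such as 49 or 50 mark an inflection point: simulating an n-qubit state on a classical machine requires tracking 2^n complex amplitudes, so the memory cost doubles with every added qubit. A quick sketch of that arithmetic:

```python
for n_qubits in (10, 30, 50):
    amplitudes = 2 ** n_qubits          # size of the state vector
    gib = amplitudes * 16 / 2**30       # complex128: 16 bytes per amplitude
    print(f"{n_qubits} qubits -> {amplitudes:,} amplitudes (~{gib:.3g} GiB)")
```

At 30 qubits the state vector already needs about 16 GiB; at 50 qubits it needs roughly a million times more, which is beyond any classical memory.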
As with a top-performing supercomputer programmed to play chess, a quantum-based cybersecurity system could see the “moves” an anomaly could make later on — and quash it on the spot. “Quantum computing will definitely be applied anywhere where we're using machine learning, cloud computing, data analysis. In security that [means] intrusion detection, looking for patterns in the data, and more sophisticated forms of parallel computing,” according to Kevin Curran, a cybersecurity researcher at Ulster University and IEEE senior member.

But the computing power that gives cyber-defenders super-tools to detect attacks can be misused as well. Last year, scientists at MIT and the University of Innsbruck were able to build a quantum computer with just five qubits, conceptually demonstrating the ability of future quantum computers to break the RSA encryption scheme. The ability to process the zeros and ones at the same time means that no formula based on such a mathematical scheme is safe.

The MIT/Innsbruck team is not the only one to have developed cryptography-breaking schemes, even on these early machines; the problem is significant enough that representatives of NIST, Toshiba, Amazon, Cisco, Microsoft, Intel and some of the top academics in the cybersecurity and mathematics worlds met in Toronto for the yearly Workshop on Quantum-Safe Cryptography last year. The National Security Agency, too, has sounded the alarm on the risks to cybersecurity in the quantum computing age. The NSA's “Commercial National Security Algorithm Suite and Quantum Computing FAQ” says that “many experts predict a quantum computer capable of effectively breaking public key cryptography” within “a few decades,” and that the time to come up with solutions is now.

Many experts consider the NSA far too conservative in its prediction: a common estimate is more like a decade to a decade and a half, and some believe it could happen even sooner. Given the leaps in progress being made on an almost daily basis, a commercially viable quantum computer offering cloud services could arrive more quickly still; the D-Wave 2000Q is so named because it can process 2,000 qubits.

That kind of power in the hands of hackers makes possible all sorts of scams that don't even exist yet. For example, forward-looking hackers could begin storing encrypted information now, awaiting the day that fast, cryptography-breaking quantum algorithms are developed. While some of the data in those encrypted files might be outdated by then, there is likely to be more than enough for hackers to use in various identity theft schemes, among other things.

In fact, why wait? Hackers are very well-funded today, and it wouldn't be beyond their financial abilities to buy a quantum computer and begin selling encryption-busting services right now. It's likely that not all the cryptography-breaking algorithms will work on all data, at least for now — this is a threat in formation — but chances are that at least some of them will, meaning that even now cyber-criminals could utilize the cryptography-breaking capabilities of quantum computers, and perhaps sell those services to other hackers via the Dark Web.
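To see what is at stake, here is a deliberately tiny RSA-style example in plain Python. Its security rests entirely on the difficulty of factoring the public modulus n; Shor's algorithm on a sufficiently large quantum computer factors n efficiently, which is why schemes of this shape are considered at risk. These are textbook toy parameters; real RSA moduli are thousands of bits long.

```python
# Toy RSA with absurdly small textbook primes, for illustration only.
p, q = 61, 53
n = p * q                    # public modulus: 3233
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent 2753 (Python 3.8+)

message = 65
cipher = pow(message, e, n)  # encrypt with the public key (e, n)
plain = pow(cipher, d, n)    # decrypt with the private key (d, n)
assert plain == message

# Anyone who can factor n recovers p and q, and with them d.
# That is easy for 3233 and infeasible classically for 2048-bit moduli;
# Shor's algorithm on a large quantum computer would change that.
```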
That NSA document predicting “decades” before quantum computers become a reality was written at the beginning of 2016 — a measure of how much progress has been made in barely a year and a half. The solution lies in the development of quantum-safe cryptography: information-theoretically secure schemes, hash-based cryptography, code-based cryptography, and exotic-sounding technologies like lattice-based cryptography, multivariate cryptography (such as the “Unbalanced Oil and Vinegar” scheme), and even supersingular elliptic curve isogeny cryptography. These and other post-quantum cryptography schemes will have to involve “algorithms that are resistant to cryptographic attacks from both classical and quantum computers,” according to the NSA. Whatever the case, it's certain that the threats to privacy and information security will only multiply in the coming decades, and that data encryption will have to proceed in lockstep with new technological advances.
As electronic devices using conventional materials reach their limits, research focus has shifted to the development of exotic materials with properties that can make electronic devices more efficient, lightweight, flexible, cost-effective, and smart. Take a look at some promising candidates.

Most of us assume that smartphones and laptops will keep getting faster and better. But that progress could come to an end in about a decade. That's when engineers will hit the limits of cramming atom-scale circuitry onto conventional silicon chips, the brains behind every computing device today. Fortunately, chip market leaders have plenty of ideas to get around that impasse. Their plans begin with refinements to today's technology and grow steadily more exotic.

Companies are investing heavily in exotic forms of carbon as a way to recraft chips. Graphene, for example, is a sheet of carbon atoms just a single atomic layer thick, arranged in a hexagonal array that looks like chicken-wire fencing. Another is the carbon nanotube, a tiny straw made from a rolled-up graphene sheet. Both forms of carbon could help push miniaturisation further than what's possible with conventional silicon. And processors could get faster even if they don't get smaller — a big selling point. Nanotubes could become transistor building blocks, although placing them precisely is a big challenge. Researchers also envision tiny transistors made using graphene, but graphene-based chips pose challenges of their own: the material conducts electrical current well but doesn't share silicon's semiconductor properties.

One way to keep pushing progress involves elements drawn from the columns to either side of group IV in the periodic table — thus the term III-V materials, pronounced simply ‘three-five.’ With III-V materials, chip manufacturing stays the same but silicon gets new elements layered on top. That helps electrons flow faster, which means less voltage is needed to get them moving; and if the chips need less power, transistors can be smaller and switch faster.

Researchers are also creating and investigating artificial and unconventional materials with unusual electronic and magnetic properties, like superconductors that transport electricity with zero losses, and very thin materials (just two or three atoms thick) that could be incorporated into transistors. The novelty of such materials makes it nearly impossible to anticipate everything that they can do: a researcher can make educated guesses about various properties, yet end up seeing something entirely different. A deeper understanding of such materials opens the possibility that engineers could route electric currents in quantum computers much as they do in conventional silicon electronics. Creating high-quality topological insulator materials is a challenge, however; since their useful properties occur on the surface, nanoscale ribbons and plates, with their large surface area, are ideal to work with.

British-born researchers won the 2016 Nobel Prize in Physics for their theoretical explanations of strange states (topological phases) of matter in two-dimensional materials. Their work laid the foundation for predicting and explaining bizarre behaviours that experimentalists discovered at the surfaces of materials and inside extremely thin layers. These include superconductivity — the ability to conduct electricity without resistance — and magnetism in very thin materials.
Physicists are now exploring similar states of matter for potential use in a new generation of electronics, including quantum computers. And the theories pioneered by the Nobel winners have been extended to develop exciting materials such as topological insulators.

Topological insulators are a class of solids that conduct electricity like a metal across their surface while blocking the current's flow like rubber through their interior. Theoretical physicists predicted their existence in 2006, and experimentalists demonstrated the first such material in 2008.

Engineers find a few traits of topological insulators especially exciting. One is that the electrons move in a direction determined by their spin — a quantum-mechanical property that forms the basis of magnetic data storage. Engineers hope to exploit this spin-motion connection to make superfast hard drives. Topological insulators also open the door to tailoring topological electronic properties by stacking different thin sheets, or 2D materials. These exotic 2D materials could be used as a platform for energy-efficient computing (spintronics) and to solve today's intractable challenges with quantum computing.

Candidate materials for topological insulators

Like graphene, the semi-metal tungsten ditelluride (WTe2) can be prepared as a single monolayer, with tellurium atoms sandwiching the transition metal tungsten in each layer. Such sandwiched transition-metal materials are important for future electronics and photonics. Scientists have predicted that WTe2 in monolayer form has the exotic electronic properties of a topological insulator. However, the surface of WTe2 oxidises in air, destroying those electronic properties. Now, researchers have made devices from WTe2 down to a single layer thick that are air-stable and have good electrical contacts. Surprisingly, they found that a single-layer sheet became insulating at liquid-nitrogen temperatures when no gate voltage was applied; for large enough positive or negative contact voltages, the electrical current switched on, as in a transistor.
Quantum computing has become a buzzword in the IT industry. Some people think it'll change how we do computing forever and give us more processing power than we ever imagined. Some fear this new technology might break all current encryption and security. Others are creating sci-fi shows based on quantum computing, like Devs. But most people, even many developers, aren't quite sure what quantum computing is. Let's clear up some of the confusion.

Quantum computing terms you need to know

Before we get into how quantum computing works, let's look at some key terms that you'll need to know to understand the concept.

The quantum in quantum computing refers to quantum mechanics. A quantum in physics is the minimum amount of any physical property that can exist. For instance, a photon is a single quantum of light. The quantization of energy, and how it affects the interactions between matter and energy, is part of the fundamental framework for describing the physical world.

Qubit is short for quantum bit — the quantum version of the bit we use in classical computing. Standard bits can only be one of two values: 1 or 0. Qubits, on the other hand, hold a superposition of all possible states. Every quantum state can be represented as a sum of two or more other distinct states, and quantum particles combine all of these states at once until they're actually observed and measured. Think of a coin flip: once the coin lands on the ground, it'll be heads or tails, but while it's in the air, it still has a chance of being either one. Quantum computers use the concept of superposition to manipulate qubits and affect their probabilities before making a final measurement to get the answer.

Entanglement is a process by which quantum particles can link up so that their states stay linked no matter how far apart they are in space. They share a unified quantum state and can exert an influence on each other. By entangling qubits in a quantum computer, more information can be represented simultaneously, giving the quantum computer more computing power and the ability to solve more complicated problems.

While entanglement is a good thing in a quantum computer, interference is bad. Quantum interference is part of a qubit's natural behavior that can influence the probability of the final measurement of its superposition. Quantum computers try to reduce interference as much as possible to ensure more accurate results.

How does quantum computing work?

A quantum computer has three main parts. The first part is the structure that holds the qubits used for computation. These qubits must be stored in a way that minimizes quantum interference. In some quantum computers, superfluids chill the qubit housing to a hundredth of a degree Celsius above absolute zero to keep the qubits stable. Other quantum computers use a vacuum to help with qubit coherence and minimize interference between qubits.

The second part is a mechanism for transferring information to the qubits. To use them for computations, their behavior must be controlled so they can hold, change, and read information. There are a few ways to do this: lasers, microwaves, and voltage are the most common.

The third and final major part of a quantum computer is a standard computer where the code written for the quantum computer is run. It interfaces with the control mechanism, which sends instructions to the qubits.
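The coin-flip analogy can be made precise in a few lines of Python: a qubit's state is a two-entry vector of complex amplitudes, and measurement probabilities are the squared magnitudes of those amplitudes (the Born rule). A minimal sketch using NumPy:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                       # the |0> state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

state = H @ ket0            # (|0> + |1>) / sqrt(2): an equal superposition
probs = np.abs(state) ** 2  # Born rule: [0.5, 0.5]

# "Flipping the coin": sampling collapses each shot to 0 or 1.
samples = np.random.default_rng(seed=0).choice(2, size=10, p=probs)
print(probs, samples)
```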
Where can quantum computing be used?

Quantum computing is still in its early stages, and it's not quite ready to be used in everyday businesses. Still, some companies are starting to find new uses for the technology. Most of the work in quantum computing is currently being done by scientists and quantum computing experts who create proof-of-concept applications and test them on a small scale to help identify future uses for the technology. That way, they'll be ready when quantum hardware develops to the point that it's practical for more uses. Also, while a quantum computer can do certain things many magnitudes faster than a classical computer, quantum computers don't do everything more quickly and aren't practical for some computational problems. Here are some of the many industries where quantum computing will have the biggest impact.

The power of quantum computers threatens to make current cryptography techniques obsolete, such as RSA encryption, which is used to secure much of the sensitive data in the digital world. The good news is that there are already companies working on new cryptography techniques that even quantum computers can't crack.

Machine learning is changing many things about our world, but running machine learning algorithms on traditional computers takes a lot of time and resources. Scientists and quantum computing researchers are looking into new ways to make machine learning faster and more efficient using quantum computers.

Quantum computers have many uses in the healthcare industry. They simulate chemical reactions much faster than standard computers, and they're also used for protein folding, where they help speed up the creation of new drugs.

Quantum computing is also used in fintech, where its power makes parsing massive amounts of financial data quicker and model creation more accurate. It can also be used in fraud detection and portfolio risk optimization.

Quantum computers are good at optimization. There are many challenges involved in supply chains and international shipping routes that can take a standard computer literally years to solve, but that a quantum computer can solve in only minutes.

Programming languages and SDKs used in quantum computing

The programming languages used in quantum computing may have a similar syntax to those used in standard programming, but they were created specifically to handle the quantum computing environment. That doesn't mean you can't still use standard programming languages: there are high-level SDKs (Software Development Kits) written in languages like Python that allow you to branch into quantum computing without needing to learn a new language. Here are some of the many programming languages and SDKs used in quantum computing:

- QCL: QCL (Quantum Computing Language) is one of the first programming languages used for quantum computing. Its syntax resembles the C programming language, and its data types are similar to the primitive data types in C.

- Q: Q was the second programming language designed for quantum computers. It was designed as an extension of C++, so C++ developers can start working with it quickly.

- OpenQASM: OpenQASM (Open Quantum Assembly Language) is a low-level language released by IBM for use with quantum computers.

- Q#: Q# is an open-source quantum programming language offered by Microsoft. It has some features that developers who know the Python, C#, and F# programming languages will recognize.
- Silq: Silq is an open-source high-level programming language written in the D programming language. It's available on GitHub and is relatively new; the first version was published in 2020.

- Cirq: Cirq is a Python library created by Google for writing, manipulating, and optimizing quantum circuits. Cirq abstracts away many of the low-level details of quantum hardware in a language familiar to many developers.

- Qiskit SDK: Qiskit is a software development kit created specifically for working with the OpenQASM programming language and IBM Q quantum processors. It's written in Python, so developers don't need low-level knowledge of quantum hardware to use it.

- Braket SDK: The Braket SDK is yet another quantum computing SDK written in Python; it works with Amazon's proprietary Braket quantum computing platform.

How to get started in quantum computing

As we said, quantum computing isn't yet practical enough to be used in the average business, so you can't get a job writing code for quantum computers yet — unless the job is with a business currently experimenting with the technology or building its own quantum computers. Still, you can experiment with quantum computer coding right now. Here are four places to do that:

- Amazon Braket: Amazon will give you one free hour per month to experiment with its quantum computing platform, and it provides an SDK written in Python for interacting with the Braket platform, so you can write quantum code in a familiar programming language.

- IBM Quantum: You can also sign up for an account with IBM to run experiments on its quantum computing platform. You can write your code in Python here using the Qiskit SDK.

- Azure Quantum: You can experiment with the quantum computers that Microsoft has access to, and when you sign up, you can get a free $200 credit.

- D-Wave Leap: D-Wave also provides developers with limited free access to its quantum computing platform.

Python is a good choice if you're ready to jump into quantum computing today, since Cirq, the Qiskit SDK, and the SDK for Amazon's Braket are all based on the language. Check out our Learn Python 3 course to learn what you need to know to get started. Or, if you'd rather work with some of the low-level languages used for quantum computing, try Learn C++.
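As a concrete taste of these SDKs, the snippet below builds and samples a two-qubit entangled (Bell) state with Cirq. It follows the library's commonly documented public API, but APIs evolve, so treat it as an illustrative sketch rather than version-exact code.

```python
import cirq

a, b = cirq.LineQubit.range(2)
circuit = cirq.Circuit(
    cirq.H(a),                  # put qubit a into superposition
    cirq.CNOT(a, b),            # entangle a with b
    cirq.measure(a, b, key="m"),
)

result = cirq.Simulator().run(circuit, repetitions=100)
# Expect only the correlated outcomes 00 and 11 (histogram keys 0 and 3).
print(result.histogram(key="m"))
```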
The astrolabe, magnetic compass, caravel, sextant, and Mercator's projection were the five significant advances of the Age of Exploration. A compass is a tool for navigation and geographic orientation that displays the cardinal directions; it usually consists of a magnetized needle or other device, such as a compass card or compass rose, that can rotate to align itself with magnetic north.

What technology helped the Europeans?

Steel steamships (along with other technologies) aided European empires in expanding inland in Africa and Asia, and the discovery of quinine made the exploration of Africa much simpler.

What were the three technologies that made European exploration easier?

Answers include: mapmakers improved their processes and generated more accurate maps; the astrolabe aided navigation; and the three-masted caravel enabled ships to go farther.

How did technology play a role in European exploration?

New technologies also aided European expeditions and discovery. Better maps showed ocean currents and latitude lines more accurately, and inventions such as the astrolabe and the magnetic compass improved navigation.

What navigation technology was used by explorers?

The astrolabe. Many European explorers, including Columbus and Magellan, employed the astrolabe as one of their most essential navigational aids.

What technology did the European settlers bring to America?

Several technological breakthroughs, such as the compass, the caravel, and the astrolabe, greatly aided European colonization of the Americas. They influenced economic growth by enabling the creation of large-scale trading networks between the Old and New Worlds.

Related Questions and Answers

Why did Europe have better technology?

It may be related to culture, but it is more typically due to other variables like location, environment, available resources, or even population size. Many technological achievements are the result of a lengthy trial-and-error process involving a great deal of luck and chance.

What were the new technologies of 1450-1750?

In the Early Modern Period (1450-1750): the sternpost rudder, invented in China during the Han Dynasty, improved steering; lateen sails allowed ships to sail in any direction, regardless of the wind; the astrolabe measured latitude from the height of the sun and stars above the horizon; and the Chinese magnetic compass allowed orientation without sight of land.

What technologies led to the Age of Exploration, and where did they originate?

The Age of Exploration took place during the 1400s and 1500s, during the Renaissance, and it ushered in a spirit of exploration and creativity throughout Europe. The compass, the astrolabe, and innovative ships like the caravel were some of the advances that made the Age of Exploration feasible.

What helped European exploration succeed?

A faster eastbound route mattered, but commerce was the most significant motivator for exploration. The historic expedition of Marco Polo to Cathay signified Europe's "discovery" of Chinese and Islamic cultures. Traders were drawn to the Orient, and exotic goods and money flooded into Europe.

Which technological advancements made it possible for European sailors to reach the Far East?
More precise maps, better ships, and better navigation instruments like the compass and the astrolabe were three technical breakthroughs that helped make European voyages of discovery feasible. The astrolabe was used to calculate the positions of the stars, which allowed sailors to work out how much leeway they had at sea.

How did the astrolabe help European explorers?

The astrolabe was a portable gadget used by sailors to help them find their way. By measuring the height of the sun and stars above the horizon, it helped determine latitude, an essential aid in navigation.

Which technological improvements led to the era of European exploration?

The astrolabe, the magnetic compass, the caravel, the sextant, and Mercator's projection were the five significant breakthroughs of the Age of Exploration.

What technology did First Nations use?

Traditionally, First Nations societies made tools out of natural materials for hunting, fishing, and textile production. The Dakelh used stone, bone, antler, teeth, and wood to make arrowheads and spearheads, and wove caribou skin and plant bark together to create beaver nets.

What was the role of improved technology in European exploration in the 15th and 16th centuries?

At sea, the mariner's astrolabe was used to calculate latitude, while improved ship designs allowed crews to sail farther and faster.

Is Europe technologically advanced?

As is customary, East Asian and European nations dominated the top slots in the rankings. Ten of the top 20 nations were in Europe and five were in East Asia, indicating that the competition for technological growth has a strong regional component.

How advanced is Europe in terms of technology?

Europe is currently slipping behind not just the United States and Japan but even China in technological innovation. China has surpassed the EU in R&D spending, which accounts for 2.1 percent of its GDP, and there isn't a single European company among the world's top 15 digital enterprises today.

How did Europe advance faster?

One view is that the industrial revolution in Western Europe was triggered by increased productivity and intense expansion through the use of machines. This makes sense, and it's likely one of the reasons why Western Europe "advanced" more quickly over the previous several centuries.

What were some examples of technology that developed during the period 1450-1750?

Following contact with Chinese traders, gunpowder, papermaking, block printing, and the compass were all introduced to European civilisation.

What technological innovations helped Europeans create their maritime empires?

New ships and other maritime technologies, such as lateen sails, updated charts, and the astrolabe.

What made European exploration possible?

God, gold, and glory are the three main motivations historians cite for European exploration and colonization of the New World.

How did the sextant help European explorers?

Navigators and surveyors used sextants to measure the angle between two objects. At sea, they were used to measure the angle between a celestial object (the sun, moon, planets, or stars) and the horizon.

What were the three main tools of navigation that led to the Age of Exploration?
Lateen sails, the astrolabe, and the magnetic compass were three instruments of particular importance during this era.

What four navigation tools and inventions improved during the Renaissance?

Navigation relied on tools such as the hourglass, the quadrant, the compass, and the nautical chart.

What is traditional and Indigenous technology?

Indigenous technology draws on locally available natural materials rather than industrial ones. For example, instead of using industrial coal and lime for house building, one may use charcoal and seashell mortar.

What did First Nations invent?

First Nations innovations include canoes and kayaks, darts, lacrosse (a precursor to hockey), petroleum jelly, and cough syrup.

Who has the best technology in the world?

According to a recent assessment published by the United Nations Development Programme (UNDP), Finland is the world's most technologically advanced nation, ahead of the United States.

Is Europe behind in tech?

Europe is now lagging in critical technical infrastructure such as semiconductors and ultrafast telecommunications networks. Cisco in the United States and Huawei in China have constructed the infrastructure that powers the internet for Europe's 700 million people.

What is the most advanced technology?

Among the new technology trends to watch in 2022: machine learning and artificial intelligence (AI), robotic process automation (RPA), edge computing, quantum computing, and virtual and augmented reality.
Over the past several millennia, humanity has transformed itself from a species of hunter-gatherers into a global and interconnected civilisation. Researchers have long debated which traits of our ancestors allowed such a dramatic metamorphosis to take place, but perhaps the most widely agreed-upon theory has been our ability to convey complex ideas to each other through speech and writing — the rich communication tool which we call human language.

As we exchange knowledge with each other in this way, our embodied perception adopts a constant state of change. Dr Sánchez-Flores proposes that this process is rooted in a phenomenon known as ‘autopoiesis’ — a term first coined by Chilean biologists Humberto Maturana and Francisco Varela in 1972 to describe the self-maintaining chemistry of living cell populations. For humans, this means that we are continually shaped by what we perceive through our nervous systems, while at the same time we continually shape our surrounding environments through our embodied exchanges. We express those exchanges to one another using language.

“Human language is a by-product of the relationships that human beings build with one another and on which they depend as organisms to survive, thrive, and emerge as persons”, Dr Sánchez-Flores explains. “Human beings as living organisms engage in autopoiesis and the languages we produce are both enabled by our human biology and enable our own autopoiesis.”

Yet on a deeper level, researchers have not widely examined how the autopoietic nature of our use of language connects with wider concepts that can help us understand how life sustains itself. In 1949, the American philosopher John Dewey proposed in his book “Knowing and the Known” that the exchanges which take place between all living organisms, as well as with their environment, are ‘trans-actional’. This means that all reciprocal activity between organisms can be described as mutual, simultaneous exchanges through which all parties involved are changed in some way. The idea is distinctly opposed to ‘interaction’, in which physical objects are independent of each other and don't do anything unless they are acted upon by other objects or forces.

Through her research, Dr Sánchez-Flores argues that this all-encompassing idea of the nature of exchanges between living systems must also apply to the language we use. “According to Dewey, a trans-actional presentation of knowledge means that everything that we seek to explain as observers exists in continuity with everything else”, she describes. “I propose that the autopoietic conception of language is eminently trans-actional in this way. Seeing everything that we want to explain in continuity with everything else discloses the realm of simultaneity.”

Such a concept appears to be at odds with the classical view of the universe: according to earlier thinkers like Newton and Descartes, processes can only take place if they are triggered by separate, previous events. However, the trans-actional, autopoietic interpretation of language isn't without a physical basis. For a better comparison, Dr Sánchez-Flores looks to a more recent interpretation of physics in which the chain of cause and effect is no longer set in stone: the quantum realm.
Quantum entanglement as trans-actional presentation of knowledge

Perhaps one of the most famous and mind-bending consequences of quantum physics is the principle of entanglement, which describes how the fate of one particle can entirely depend on that of another to which it is connected. This would mean that when a scientist observes the state of one particle, the result will determine the observed state of its entangled partner at exactly the same time, even if they are separated by large distances. Additionally, the presence of scientific observation itself alters the outcome of the experiment.

Dr Sánchez-Flores believes that this strange, yet experimentally proven phenomenon is the clearest example of a trans-actional presentation of knowledge. It exemplifies how the notion of independent physical bodies acting on each other and the objective/subjective duality are useful myths or illusions of the Cartesian worldview that is most prevalent today. As environments present organisms with new problems, their resulting actions to find a solution will change the organisms themselves, and their trans-actions will trigger simultaneous change in their environment in turn.

“Autopoietic living systems are, at the same time, organisationally closed and structurally coupled to their environment”, explains Dr Sánchez-Flores. “Both Maturana and Dewey see the scientific observer as an organism itself seeking to solve a problem that is underpinned by the organism’s experience and by its need to find equilibrium in its environment.” Simultaneous closure and coupling in body-sized organisms help us visualise how everything is connected to everything else, similarly to quantum-sized particles, where this kind of entanglement occurs at a subatomic level.

Overcoming human barriers

Human language as trans-actional autopoiesis could have profound implications for the role which language plays in shaping our understanding of human existence. In comparison to the surrounding environment of a living organism, language represents a trans-actional autopoietic environment which spans the countless groups we have divided ourselves into over the course of history. Unfortunately, we have become all too familiar with the damaging consequences of the many disagreements, misunderstandings, and prejudices which occur between these groups.

Dr Sánchez-Flores argues that we would be better equipped to heal these divides if we better understood the role which language plays in shaping our species as a whole. Instead of passively observing our surroundings, she proposes, every one of us actively participates in the development of the world as we perceive it as we exchange knowledge with each other. In turn, knowledge of the continually re-shaping world in awareness of simultaneity opens up the possibility of acknowledging the damaging and violent effects of in-group/out-group human barriers in order to heal them.

“This reinforces and supports a trans-actional presentation of knowledge with cosmopolitan possibilities, where human beings are aware of their need for and dependence on one another beyond artificially created borders – such as tribes, ethnic groups, races, disciplines, and nations”, Dr Sánchez-Flores illustrates.
“From this, human beings can be made aware of the vital and essential way in which we are all connected to one another, to non-human organisms, and to our environment.”

From the immediate threat of a worldwide pandemic to the long-term consequences of a heating climate, a thorough understanding of the role that language plays in shaping our existence has never been more important. By viewing the role of language through the lens of trans-actional autopoiesis, Dr Sánchez-Flores believes that our systems of governance would be far better equipped to understand and face these global challenges, and to account for the widely varied needs of the different groups these efforts involve. This would ultimately provide a strong basis for making decisions based on reason, compassion, and equity, all while accounting for a diverse range of languages, worldviews, and ideologies.

Tackling global challenges

Dr Sánchez-Flores believes her ideas come at a crucial crossroads in the story of our species – one in which humanity as a whole faces a set of challenges more global and all-encompassing than any in its history. She concludes, “in the current COVID-19 world-pandemic, it is urgent for the human species that a trans-actional presentation of knowledge becomes the new normal, just as caring and compassion have become constant sources of inspiration during this crisis. This cannot be postponed due to the probable emergence of new pandemics and other climate crises that threaten the very existence of the human species.”

Ultimately, Dr Sánchez-Flores’ ideas clearly show that just as the exchange of knowledge has enabled us to thrive as a species, it is now the ultimate toolset for dealing with these existentially daunting problems.

What steps do you think world governments could take to implement your ideas into their decision-making processes?

Modern states represent one more stakeholder in the realm of global or transnational governance – albeit an essential one as a source of legitimate law. Transnational governance structures may include governments and their agents, but also social movements, Indigenous peoples, grassroots organisations, powerful individuals, as well as corporations. COVID-19 has produced awareness that human beings are connected to each other and their environment, and that borders can be futile. To respond to future pandemics and crises, it is essential to strengthen existing transnational structures of governance to produce essential care and services for people around the world, especially for the most marginalised.
Quantum computers are making all the headlines these days, but quantum communication technology may actually be closer to practical implementation. In a bid to hasten its arrival, researchers have now mapped out the path to a quantum internet.

The building blocks for these emerging technologies are more or less the same. They both use qubits to encode information—the quantum equivalent of computer bits, which can simultaneously be both 1 and 0 thanks to the phenomenon of superposition. And they both rely on entanglement to inextricably link the quantum states of these qubits so that acting on one affects the other.

But while building quantum computers capable of outperforming conventional ones on useful problems will require very large networks of qubits, you only need a handful to build useful communication networks. And we’re already well on the way.

In a review article in Science, researchers from Delft University of Technology in the Netherlands outlined six phases of development towards a global network of quantum-connected quantum computers and pointed out that we’re already on the bottom rung of that ladder.

“We are now at an exciting moment in time, akin to the eve of the classical internet,” the researchers wrote. “Recent technological progress now suggests that we may see the first small-scale implementations of quantum networks within the next five years.”

The main advantages of a quantum communication network over a conventional one are speed and security. No matter how far apart you put two entangled qubits, measuring one has an instant, measurable effect on the state of the other—although, on its own, this cannot be used to send a usable message faster than light.

It’s also essentially impossible to eavesdrop on a quantum conversation. Under quantum mechanics, if you read the quantum state of an object you change that quantum state, which means the act of intercepting any message encoded in quantum states will immediately change the content of the message.

But the same property that makes quantum communication intrinsically secure also poses a major challenge. It means qubits can’t be copied or amplified, two essential ingredients of classical communication systems.

Nonetheless, working quantum “trusted repeater networks” are already in operation, which the researchers identify as the first step on the way to a full quantum internet. These networks feature nodes that can encode and decode qubits, which are then sent across optical cables or potentially beamed down from space by a satellite. But because quantum signals degrade the further they travel, it’s necessary to pass messages from node to node to cover longer distances. Each of these handovers is secure, but if two distant nodes need to communicate, then all the nodes in between know the content of the message, and so must be trusted if the message is to remain secure.

To reach the next stage we will need to develop reliable quantum repeaters, the researchers said. This is a device able to establish entangled qubits with each node and then rely on quantum teleportation to effectively swap entanglements around so that the two nodes are entangled. A network connected by these kinds of repeaters would allow any node to securely communicate with any other without having to trust any of the intermediaries.
At both of these stages, the principal use would be quantum key distribution, which allows two nodes to securely share an encryption key in a way that can’t be eavesdropped on; the key can then be used to decode encrypted messages sent via conventional communication channels.

The process of entangling distant qubits is still hit-and-miss at the moment, though, so the next stage will be to create a network that’s able to create entanglements on demand. The main advantage of this kind of “entanglement distribution network” is that it will make the network device-independent, according to the researchers.

After that, the development of quantum memory will allow much more complicated communication protocols that require quantum information to be stored while further communication goes on. This is a major challenge, though, because quantum states rapidly degrade through a process called decoherence. Most proposed technologies only hold their states for seconds or fractions of a second, which poses problems for a network whose communication times are longer than that. But if it could be realized, it would make it possible for simple quantum nodes to send computations to a quantum computer on the network, potentially creating a kind of quantum cloud. It could also make it possible to do things like synchronize distant telescopes to create a single “super telescope.”

Ultimately, the goal is to create a network of fully connected quantum computers. The first phase of that will be a “few-qubit fault-tolerant network,” in which the quantum computers at each node will not yet be large enough to outdo standard computers. Nonetheless, the fact that they incorporate fault tolerance will mean they can carry out relatively complex computations and store quantum data for significant amounts of time.

And the final stage will come when these quantum computers finally surpass their conventional cousins, making it possible to create distributed networks of computers capable of carrying out calculations that were previously impossible, and to instantly and securely share the results around the world.

The authors noted that there’s a long road ahead. We need better ways of encoding, storing, and transmitting quantum information, and perhaps even more importantly, we need to build quantum equivalents of our internet communication protocols, something almost entirely lacking today. But they’re bullish that the first multinode quantum networks will appear in the next few years, making it possible to test all these ideas and hopefully turbocharge development of a true quantum internet.
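To make the quantum key distribution step concrete, here is a minimal toy simulation of the sifting stage of BB84 (an assumption on my part, since the article doesn't name a specific protocol; BB84 is the classic example). It models only the classical statistics of basis choices, not real quantum hardware:

```python
# A toy BB84 sifting simulation: classical statistics only, no real qubits.
import random

def bb84_round():
    """One protocol round: Alice sends a bit in a random basis, Bob measures."""
    alice_bit = random.randint(0, 1)
    alice_basis = random.choice("+x")    # rectilinear (+) or diagonal (x)
    bob_basis = random.choice("+x")
    if bob_basis == alice_basis:
        bob_bit = alice_bit              # same basis: Bob reads the bit faithfully
    else:
        bob_bit = random.randint(0, 1)   # wrong basis: outcome is 50/50 noise
    return alice_bit, alice_basis, bob_basis, bob_bit

def sift_key(n_rounds=1000):
    """Keep only the rounds where the (publicly compared) bases matched."""
    return [a for a, a_basis, b_basis, b in
            (bb84_round() for _ in range(n_rounds))
            if a_basis == b_basis]

key = sift_key()
print(f"sifted key bits: {len(key)} of 1000 rounds")   # roughly 500 on average
```

On average half the rounds survive sifting; in a real deployment the two parties would also sacrifice some of the surviving bits to check for the disturbance an eavesdropper would introduce.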
The technology that allowed Marty McFly to travel back in time in the 1985 movie Back to the Future was the mythical flux capacitor, designed by inventor Doc Brown. We’ve now developed our own kind of flux capacitor, as detailed recently in Physical Review Letters. While we can’t send a DeLorean car back in time, we hope it will have important applications in communication technology and quantum computing.

How did we do it? Well, it’s all to do with symmetry. There are many kinds of symmetry in science, including one that deals with time reversal. Time reversal symmetry is a complex sort of symmetry that physicists like to think about, and it relies on the imaginary as much as the real.

Suppose you make a movie of an event occurring. You could then ask: “If I edited the movie to run backwards, and showed it to my friends, could they tell?” This might seem obvious: people don’t usually walk or talk backwards; spilt milk doesn’t spontaneously jump back into its carton; a golf ball doesn’t miraculously launch backwards from the fairway, landing perfectly balanced on the tee at the same moment as the club catches it.

But at a microscopic level, the story is not that clear. The collision of two billiard balls looks pretty similar in reverse; even more so for the collision of two atoms. A beam of light travelling in one direction obeys exactly the same laws of physics as a beam of light travelling in the opposite direction. Indeed, the basic equations of physics look essentially the same if we replace time with its negative. This mathematical transformation reverses the flow of time in our equations. Since the microscopic laws of physics appear to be unchanged under this transformation, we say the universe possesses time reversal symmetry, even though we cannot actually reverse time in reality. Unlike Doc Brown, we can’t make the clock tick backwards.

There is a conceptual conflict here. At the macroscopic scale, the entropy of the universe — a measure of disorder or randomness — always increases, so that there is an arrow of time. This is obvious in our everyday experience: a scrambled egg is not reversible. How this irreversibility emerges from microscopic laws that are reversible remains a mystery.

The circulator circuit

Microscopic reversibility presents an important technological challenge: it complicates the diversion of electronic and radio signals around a circuit. There are various applications where engineers want electromagnetic signals (such as light or radio waves) in a circuit to behave a bit like cars around a roundabout: a signal entering port A of the device should be directed to port B; a signal entering at B should go to port C; and a signal entering port C should be directed to port A, clockwise around the device.

One way to do this is to use a network of amplifiers to switch signals as desired. But there is a profound result in quantum mechanics (the “no cloning theorem”) that means amplification must always add noise, or randomness, to the signal. Sorry, audiophiles: a perfect amplifier is impossible. If the signal is extremely weak, so that additional noise is intolerable, then noiseless circulation is accomplished with a device called a circulator. Such devices are used to separate very weak signals going to and from sensitive electronics, including in radar receivers, or in existing and future quantum computers. It turns out a device like this must locally break time reversal symmetry.
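A compact way to see what broken time reversal symmetry means for such a device is its scattering matrix. The sketch below is my own illustration, not taken from the paper: it writes down the ideal three-port circulator (A to B, B to C, C to A) and checks that it is lossless (unitary) yet not reciprocal (not equal to its transpose).

```python
# Scattering matrix of an ideal three-port circulator over ports (A, B, C).
import numpy as np

S = np.array([[0, 0, 1],    # output at A comes from whatever enters at C
              [1, 0, 0],    # output at B comes from whatever enters at A
              [0, 1, 0]])   # output at C comes from whatever enters at B

print(np.allclose(S.conj().T @ S, np.eye(3)))  # True: unitary, i.e. lossless
print(np.allclose(S, S.T))                     # False: non-reciprocal

# A signal entering port A (the vector [1, 0, 0]) exits at port B:
print(S @ np.array([1, 0, 0]))                 # [0 1 0]
```

A reciprocal device, one that respects time reversal symmetry, must have a symmetric scattering matrix; the asymmetry of S here is exactly the one-way roundabout behaviour described above.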
If we made a movie of the signals coming and going from the circulator, and ran the movie backwards, it would look different. For example, we would see a signal entering port B and leaving via port A, rather than via C. But most devices in a quantum research laboratory, such as mirrors, beam splitters, lasers and atoms, do not break time reversal symmetry, so they cannot be used as circulators. Something else is needed.

The practical way to break time reversal symmetry for real devices is to introduce a magnetic field. Like a rotating vortex in water, magnetic fields have a circulation, since they arise from electrical currents circulating in an electrical loop. The magnetic field defines a direction of rotation (clockwise or counterclockwise) for electrically charged particles and thus for electrical signals. So when physicists say that a device breaks time reversal symmetry, they usually mean that there is a magnetic field about somewhere.

Commercial circulators are an anomaly in the world of electronics. Unlike transistors, diodes, capacitors and other circuit elements, basic materials science means that commercial circulators have not been miniaturised, and are still the size of a coin. Building them into large-scale integrated microelectronic circuits is therefore a challenge. This will become an increasing problem as we try to fit thousands of qubits on a quantum computer chip, each requiring its own circulator to enable control and read-out.

Our quantum flux capacitor

We have developed a new way of building micrometre-sized circulators that can be fabricated on a microchip. We figured out how to integrate magnetic flux quanta — the smallest units of magnetic field — with microfabricated capacitors and other superconducting circuit elements, so that time-reversal symmetry can be broken. This led to our new circulator proposal. As with conventional circulators, there is a magnetic field present. But because we can use just one magnetic flux quantum, our design can be microscopic.

Sadly for history buffs, our design won’t help much in your DeLorean time machine: it doesn’t reverse time. But its magnetic field does break time-reversal symmetry as advertised, and we expect these devices will find applications in future quantum technologies. Even sooner, they may help in high-bandwidth communications environments like mobile phone base stations in very dense populations, or in ultra-high-sensitivity radar where every photon of the electromagnetic field counts.
USC scientists have demonstrated a theoretical method to enhance the performance of quantum computers, an important step toward scaling a technology with the potential to solve some of society’s biggest challenges.

The method addresses a weakness that bedevils the performance of the next-generation computers by suppressing erroneous calculations while increasing the fidelity of results, a critical step before the machines can outperform classical computers as intended. Called “dynamical decoupling,” it worked on two quantum computers, proved easier and more reliable than other remedies, and could be accessed via the cloud, a first for dynamical decoupling.

The technique administers staccato bursts of tiny, focused energy pulses to offset ambient disturbances that muck up sensitive computations. The researchers report they were able to sustain a quantum state up to three times longer than would otherwise occur in an uncontrolled state.

“This is a step forward,” said Daniel Lidar, professor of electrical engineering, chemistry and physics at USC and director of the USC Center for Quantum Information Science and Technology (CQIST). “Without error suppression, there’s no way quantum computing can overtake classical computing.”

The results were published today in the journal Physical Review Letters. Lidar is the Viterbi Professor of Engineering at USC and corresponding author of the study; he led a team of researchers at CQIST, which is a collaboration between the USC Viterbi School of Engineering and the USC Dornsife College of Letters, Arts and Sciences. IBM and Bay Area startup Rigetti Computing provided cloud access to their quantum computers.

Quantum computers are fast, but fragile

Quantum computers have the potential to render obsolete today’s supercomputers and propel breakthroughs in medicine, finance and defense capabilities. They harness the speed and behavior of atoms, which function radically differently than silicon computer chips, to perform seemingly impossible calculations. Quantum computing has the potential to optimize new drug therapies, models for climate change and designs for new machines. It could achieve faster delivery of products, lower costs for manufactured goods and more efficient transportation.

Quantum computers are powered by qubits, the subatomic workhorses and building blocks of quantum computing. But qubits are as temperamental as high-performance race cars. They are fast and high-tech, but prone to error and in need of stability to sustain computations. When they don’t operate correctly, they produce poor results, which limits their capabilities relative to traditional computers. Scientists worldwide have yet to achieve a “quantum advantage” – the point where a quantum computer outperforms a conventional computer on any task.

The problem is “noise,” a catch-all descriptor for perturbations such as sound, temperature and vibration. Noise can destabilize qubits, creating “decoherence,” an upset that cuts short the duration of the quantum state and reduces the time a quantum computer can perform a task while achieving accurate results.

“Noise and decoherence have a large impact and ruin computations, and a quantum computer with too much noise is useless,” Lidar explained.
“But if you can knock down the problems associated with noise, then you start to approach the point where quantum computers become more useful than classical computers.”

USC research spans multiple quantum computing platforms

USC is the only university in the world to host a quantum computer; its 1,098-qubit D-Wave quantum annealer specializes in solving optimization problems. Part of the USC-Lockheed Martin Center for Quantum Computing, it’s located at USC’s Information Sciences Institute. However, the latest research findings were achieved not on the D-Wave machine, but on smaller-scale, general-purpose quantum computers: IBM’s 16-qubit QX5 and Rigetti’s 19-qubit Acorn.

To achieve dynamical decoupling (DD), the researchers bathed the superconducting qubits with tightly focused, timed pulses of minute electromagnetic energy. By manipulating the pulses, the scientists were able to envelop the qubits in a microenvironment, sequestered – or decoupled – from surrounding ambient noise, thus perpetuating the quantum state.

“We tried a simple mechanism to reduce error in the machines that turned out to be effective,” said Bibek Pokharel, an electrical engineering doctoral student at USC Viterbi and first author of the study.

The time sequences for the experiments were exceedingly small, with up to 200 pulses spanning up to 600 nanoseconds. A nanosecond, one-billionth of a second, is about how long it takes light to travel one foot.

For the IBM quantum computers, final fidelity improved threefold, from 28.9 percent to 88.4 percent. For the Rigetti quantum computer, the final fidelity improvement was more modest, about 17 percentage points, from 59.8 percent to 77.1 percent, according to the study. The scientists tested how long the fidelity improvement could be sustained and found that more pulses always improved matters for the Rigetti computer, while there was a limit of about 100 pulses for the IBM computer.

Overall, the findings show the DD method works better than other quantum error correction methods that have been attempted so far, Lidar said. “To the best of our knowledge,” the researchers wrote, “this amounts to the first unequivocal demonstration of successful decoherence mitigation in cloud-based superconducting qubit platforms … we expect that the lessons drawn will have wide applicability.”

High stakes in the race for quantum supremacy

The quest for quantum computing supremacy is a geopolitical priority for Europe, China, Canada, Australia and the United States. The advantage gained by acquiring the first computer that renders all other computers obsolete would be enormous and would bestow economic, military and public health advantages on the winner.

Congress is considering two new bills to establish the United States as a leader in quantum computing. In September, the House of Representatives passed the National Quantum Initiative Act to allocate $1.3 billion over five years to spur research and development. It would create a National Quantum Coordination Office in the White House to supervise research nationwide. A separate bill, the Quantum Computing Research Act by Sen. Kamala Harris, D-Calif., directs the Department of Defense to lead a quantum computing effort.

“Quantum computing is the next technological frontier that will change the world and we cannot afford to fall behind,” Harris said in prepared remarks. “It could create jobs for the next generation, cure diseases and above all else make our nation stronger and safer.
… Without adequate research and coordination in quantum computing, we risk falling behind our global competition in the cyberspace race, which leaves us vulnerable to attacks from our adversaries,” she said.
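To build intuition for what refocusing pulses buy you, here is a toy numerical sketch of my own, far simpler than the pulse sequences used in the study: an ensemble of qubits dephasing under random but static frequency offsets, with and without a single echo pulse at the midpoint.

```python
# Toy dephasing model: each run has a random, *static* detuning delta,
# and the qubit's phase grows as delta * t. A pi-pulse at t/2 flips the
# sign of the phase accumulated afterwards, so static noise cancels.
import numpy as np

rng = np.random.default_rng(0)
deltas = rng.normal(0.0, 1.0, size=10_000)   # one random detuning per run

def coherence(t, echo=False):
    """Ensemble-averaged coherence |<exp(i * phase)>| at time t."""
    if echo:
        phase = deltas * (t / 2) - deltas * (t / 2)  # refocused: cancels exactly
    else:
        phase = deltas * t                            # free evolution: builds up
    return np.abs(np.mean(np.exp(1j * phase)))

print(f"free decay at t=5: {coherence(5.0):.3f}")        # ~0 up to sampling noise
print(f"with echo at t=5:  {coherence(5.0, echo=True):.3f}")  # 1.0: refocused
```

Real noise drifts during the sequence, so a single echo refocuses it only partially; repeating the pulses faster than the noise changes, which is dynamical decoupling proper, is what the USC team exploited on the IBM and Rigetti machines.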
Scientists have uncovered a mathematical shortcut for calculating an all-important feature of quantum devices. Having crunched the numbers on the quantum properties of 12,000 elements and compounds, researchers have published a new equation for approximating the length of time the materials can maintain quantum information, called “coherence time.” The elegant formula allows scientists to estimate a material’s coherence time in an instant — versus the hours or weeks it would take to calculate an exact value.

The team, comprising scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory, the University of Chicago, Tohoku University in Japan and Ajou University in Korea, published their result in April in the Proceedings of the National Academy of Sciences. The work is supported by the Center for Novel Pathways to Quantum Coherence in Materials, an Energy Frontier Research Center funded by the U.S. Department of Energy, and by Q-NEXT, a DOE National Quantum Information Science Research Center led by Argonne.

The team’s equation applies to a particular class of materials — those that can be used in devices called spin qubits. “People have had to rely on complicated codes and calculations to predict spin qubit coherence times. But now people can compute the prediction by themselves instantaneously,” said study co-author Shun Kanai of Tohoku University. “This opens opportunities for researchers to find the next generation of qubit materials by themselves.”

Qubits are the fundamental unit of quantum information, the quantum version of classical computer bits. They come in different forms and varieties, including a type called the spin qubit. A spin qubit stores data in a material’s spin — a quantum property inherent in all atomic and subatomic matter, such as electrons, atoms and groups of atoms.

Scientists expect that quantum technologies will be able to help improve our everyday lives. We may be able to send information over quantum communication networks that are impenetrable to hackers, or we could use quantum simulations to speed up drug development. The realization of this potential will depend on having qubits that are stable enough — that have long enough coherence times — to store, process and send the information.

While the research team’s equation gives only a rough prediction of a material’s coherence time, it gets pretty close to the true value. And what the equation lacks in precision, it makes up for in convenience. It requires only five numbers — the values of five particular properties of the material in question — to get a solution. Plug them in, and voila! You have your coherence time.

Diamond and silicon carbide are currently the best-established materials for hosting spin qubits. Now scientists can explore other candidates without having to spend days calculating whether a material is worth a deeper dive.

“The equation is like a lens. It tells you, ‘Look here, look at this material — it looks promising,’” said University of Chicago professor and Argonne senior scientist Giulia Galli, a co-author of the study and Q-NEXT collaborator. “We are after new qubit platforms, new materials.
Identifying mathematical relationships like this one points out new materials to try, to combine.”

With this equation in hand, the researchers plan to boost the accuracy of their model. They’ll also connect with researchers who can create the materials with the most promising coherence times, testing whether they perform as well as the equation predicts. (The team has marked one success already: a scientist outside the team reported that the relatively long coherence time of a material called calcium tungstate performed as predicted by the team’s formula.)

“Our results help us with advancing current quantum information technology, but that’s not all,” said Tohoku University Professor Hideo Ohno, who is currently president of the university and a paper co-author. “It will unlock new possibilities by bridging quantum technology with a variety of conventional systems, allowing us to make even greater progress with the materials we’re already familiar with. We’re pushing more than one scientific frontier.”

The other authors of the paper are F. Joseph Heremans, Argonne and UChicago; Hosung Seo, Ajou University; Gary Wolfowicz, Argonne and UChicago; Christopher P. Anderson, UChicago; Sean E. Sullivan, Argonne; Mykyta Onizhuk, UChicago; and David D. Awschalom, Argonne and UChicago.

This work was supported by the Center for Novel Pathways to Quantum Coherence in Materials, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, in collaboration with the U.S. Department of Energy Office of Science National Quantum Information Science Research Centers.
NASA quantum computer efforts will combine the space agency’s deep expertise in computing with its scientific ambition. The National Aeronautics and Space Administration — or NASA — is known as one of the key organizations that propelled humankind’s small steps and giant leaps into outer space. What many do not realize is that NASA scientists were also leaders in efforts to master computers and supercomputers, an expertise that led to computational innovations beyond space travel, including advances in structural analysis software and satellite imaging.

Now, pioneering NASA quantum computer scientists plan to continue this legacy of scientific exploration to tap the inner reaches of quantum mechanics, work that could build the technologies that may propel humanity farther into space while also helping solve some closer-to-home problems, such as climate change and pollution control.

NASA Quantum Computer History

The history of NASA quantum computer efforts goes back decades and is centered mainly in the organization’s Ames Research Center, which, coincidentally or not, is located in Silicon Valley. Computational pioneer and Ames center director Hans Mark commissioned the first massively parallel computer at Ames. That machine used multiple processors at the same time, or in parallel, and this advanced computing device offers a hint at NASA quantum computer ambitions.

Those ambitions led to the creation of the Quantum Artificial Intelligence Laboratory (QuAIL), where the organization conducts research to explore quantum computing and how it might be able to power NASA into the future — and into deep space. According to NASA, the lab conducts research on quantum applications and algorithms, develops tools for quantum computing and investigates the fundamental physics behind quantum computing.

NASA Quantum Computer Use Cases

Because many of NASA’s duties require large-scale computing efforts, the space agency’s quantum computers could tackle several tasks. Early versions of quantum computers, such as Noisy Intermediate-Scale Quantum — or NISQ — devices, could be used for planning and scheduling, fault diagnosis and machine learning, according to a NASA research paper. Other use cases would include building robust, secure communication networks and simulating many-body systems for materials science and chemistry investigations.

NASA’s quantum computing projects are going on right now. According to the authors of the paper: “For the last few years, the NASA Quantum Artificial Intelligence Laboratory (QuAIL) has been performing research to assess the potential impact of quantum computers on challenging computational problems relevant to future NASA missions. A key aspect of this research is devising methods to most effectively utilize emerging quantum computing hardware.”

Quantum sensing is another important use case for NASA quantum computers. NASA is also researching different quantum computer modalities: its teams are investigating both quantum annealing and gate-model quantum computers.

NASA Quantum Computer — the Partnerships

Not all of NASA’s quantum computer work is done in house.
The administration relies on numerous partnerships throughout the quantum computing ecosystem to investigate and advance NASA’s quantum computing explorations. These partnerships include collaborations with other government research institutions, quantum labs, large corporations and startups. It is important to note that NASA was part of the team that helped Google establish quantum supremacy in 2019.

Some of the NASA quantum computer partnerships include Google, Oak Ridge National Laboratory (ORNL) and Rigetti. NASA’s QuAIL is also part of two of the Department of Energy’s centers under the National Quantum Initiative, specifically the Co-design Center for Quantum Advantage and the Superconducting Quantum Materials and Systems Center.

Rigetti has teamed with partners, including the Defense Advanced Research Projects Agency (DARPA) and NASA, to work on quantum computing approaches to scheduling problems. On that partnership, Mandy Birch, Senior Vice President, Engineering Strategy at Rigetti, said: “We’re honored to be chosen by DARPA and believe we are uniquely positioned to demonstrate quantum advantage for this class of problem. We believe strongly in an integrated hardware and software approach, which is why we’re bringing together the scalable Rigetti chip architecture with the algorithm design and optimization techniques pioneered by the NASA-USRA team.”

Cold atom quantum computing pioneer ColdQuanta is also a NASA partner; ColdQuanta’s equipment, for example, is used on the International Space Station.

NASA Quantum Computer — the Future

As NASA’s space ambitions increase, we would expect its quantum computing ambitions to grow — one might even say boldly go — right along with its drive toward deep space. In fact, because quantum computing is in its infancy, the agency expects that NASA quantum computer projects will evolve and accelerate rapidly along with other missions. According to NASA: “Quantum computing is a field of study in its infancy. So far, it is too early to implement quantum computing into NASA missions. The role of QuAIL is to investigate quantum computing’s potential to serve the agency’s future needs, for missions yet to be proposed or even imagined.”

There are several directions NASA quantum computer research would be expected to go beyond the day-to-day tasks those devices could handle for the space agency. Quantum-secure satellites could provide snoop-proof communications for national security groups and the military. NASA’s combined expertise in both quantum computing and satellite technology would make the space agency a natural fit for work to make and launch these ultra-secure systems.

Deep space travel will also require new forms of propulsion and even new spacecraft designs. NASA quantum computers directed at materials research could assist in analyzing measurements of new types of thrusters, for example. They could also be used to determine what types of materials could be used for spacecraft, and even to custom-design materials to the exacting specifications required by long-term space travel.
As powerful as quantum computers may one day prove, quantum physics can make it challenging for the machines to carry out quantum versions of the most basic computing operations. Now scientists in China have created a more practical quantum version of the simple AND operation, which may help quantum computing reach successful near-term applications.

Conventional electronics nowadays rely on transistors, which flick on or off to symbolize data as ones and zeroes. Connecting transistors together builds devices known as logic gates, which implement logical operations such as AND, OR, and NOT. Logic gates are the building blocks of all digital circuits.

In contrast, quantum computers depend on components known as quantum bits or “qubits.” These can exist in a quantum state known as superposition, in which they are essentially both 1 and 0 at the same time. Quantum computers work by running quantum algorithms, which describe sequences of elementary operations called quantum logic gates applied to a set of qubits.

Superposition essentially lets each qubit perform two calculations at once. The more qubits a quantum computer has, the more its computational power can grow, in an exponential fashion. With enough qubits, a quantum computer could theoretically vastly outperform all classical computers on a number of tasks. For instance, on quantum computers, Shor’s algorithm can crack modern cryptography, and Grover’s algorithm is useful for searching databases at sometimes staggering speeds.

However, quantum computers face a physical limitation: all quantum operations must be reversible in order to work. In other words, a quantum computer may perform an operation only if it can also carry out an opposite operation that returns it to its original state. (Reversibility is necessary until a quantum computation is run and its results measured.)

In everyday life, many actions are reversible—for example, you can both tie and untie shoelaces. Others are irreversible—for instance, you can cook an egg but not uncook it. Similarly, a number of logical operations are reversible—you could apply the NOT operation to a variable and then apply it again to return it to its original state. Others are generally irreversible—you could add 2 and 2 together to get an outcome of 4, but you could not reverse the operation and know that an outcome of 4 began as 2 and 2 unless you knew at least one of the original variables.

The AND gate is a fundamental ingredient of both classical and quantum algorithms, but the demand for reversibility in quantum computing makes it challenging to implement: knowing only that an AND gate output a 0 does not tell you which combination of inputs produced it. One workaround is to essentially use an extra or “ancilla” qubit for each AND gate that stores the data needed to reverse the operation.

However, quantum computers are currently noisy intermediate-scale quantum (NISQ) platforms, meaning their qubits number up to a few hundred at most and are error-ridden as well. Given quantum computing’s primitive state right now, it would prove “extremely cumbersome to design and build hardware for accommodating extra ancilla qubits on an already crowded processor,” says study co-senior author Fei Yan, a quantum physicist at the Southern University of Science and Technology in Shenzhen, China. “Our technique presents a scaling advantage.
The more qubits are involved, the more cost-saving our technique would be compared to the traditional one.”

Now Yan and his colleagues have constructed a new quantum version of the AND gate that removes this need for ancilla qubits. By getting rid of this overhead, they say, their new strategy could make quantum computing more efficient and scalable than ever. “Our work will help narrow the gap between the most anticipated near-term applications and existing noisy devices,” Yan says. “We hope to see quantum AND functionality added to quantum programs on machines elsewhere, such as the IBM quantum cloud, and played with by more people.”

Instead of using ancilla qubits, the new quantum AND gate relies on the fact that qubits often can encode more than just zeroes and ones. In the new study, the researchers have qubits encode three states. This extra state temporarily holds the data needed to perform the AND operation. “We do not use any ancilla qubits,” Yan says. “Instead, we use ancilla states.”

In the new study, the scientists implemented quantum AND gates on a superconducting quantum processor with a tunable-coupling architecture. Google also employs this architecture in its quantum computers, and IBM plans to start using it in 2023. “We think that our scheme is well-suited for superconducting qubit systems where ancilla states are abundant and easy to access,” Yan says.

In experiments, the researchers used their quantum AND gate to help construct Toffoli gates, with which quantum computers can implement any classical circuit. Toffoli gates are key elements of many quantum-computing applications, such as Shor’s and Grover’s algorithms and quantum error-correction schemes. In addition, with six qubits the researchers could run Grover’s algorithm on a database with up to 64 entries. “To our knowledge, previous demonstrations of Grover’s search on any system was limited to 16 entries,” Yan says. This highlights the way in which the quantum AND operation can help scale up quantum computing, he adds.

All in all, “what we really want to emphasize is that our technique presents a scaling advantage,” Yan says. “The more qubits are involved, the more cost-saving our technique would be compared to the traditional one.” Although these experiments were conducted with superconducting qubits, Yan notes that their quantum AND gate could be implemented on other quantum-computing platforms, “such as trapped ions and semiconductor qubits, by utilizing appropriate ancilla levels.”

The scientists detailed their findings online 14 November in the journal Nature Physics.
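For readers who want to see the conventional, ancilla-based construction that Yan's team improves on, here is a small numpy sketch of my own (the paper's ancilla-free gate instead parks the bookkeeping in a third level of an existing qubit). It builds the Toffoli gate as an 8x8 permutation matrix, checks that it computes AND into a target qubit, and confirms it is its own inverse, i.e. reversible.

```python
# Toffoli gate over basis states |a b t>: flips target t iff a = b = 1,
# so starting from t = 0 the target ends up holding (a AND b).
import numpy as np

toffoli = np.eye(8)
toffoli[[6, 7]] = toffoli[[7, 6]]   # swap |110> <-> |111>

def basis(a, b, t):
    """Basis vector for the three-qubit state |a b t>."""
    v = np.zeros(8)
    v[(a << 2) | (b << 1) | t] = 1.0
    return v

for a in (0, 1):
    for b in (0, 1):
        out = toffoli @ basis(a, b, 0)
        print(f"{a} AND {b} -> {int(np.argmax(out)) & 1}")  # last bit is the target

# Reversibility: applying the gate twice is the identity.
print(np.allclose(toffoli @ toffoli, np.eye(8)))  # True
```

The target qubit here plays the role of the extra register that stores the AND result so the operation can be undone; the Shenzhen group's contribution is removing the need for that dedicated qubit.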
USC (US) — Researchers have built a quantum computer in a diamond, the first of its kind to include protection against harmful noise called “decoherence.” The demonstration showed the viability of solid-state quantum computers, which—unlike earlier gas- and liquid-state systems—may represent the future of quantum computing because they can easily be scaled up in size. Current quantum computers typically are very small and, though impressive, cannot yet compete with the speed of larger, traditional computers.

[Figure: A 20 micron x 20 micron magnification of the diamond chip, showing an integrated diamond lens above the single particle spins where the calculations take place. (Credit: Delft University of Technology/UC Santa Barbara)]

The multinational team included University of Southern California professor Daniel Lidar and postdoctoral researcher Zhihui Wang, as well as University of California, Santa Barbara physicist David Awschalom. The findings are published in Nature.

The team’s diamond quantum computer system featured two quantum bits, or qubits, made of subatomic particles. As opposed to traditional computer bits, which can encode distinctly either a one or a zero, qubits can encode a one and a zero at the same time. This property, called superposition, along with the ability of quantum states to “tunnel” through energy barriers, someday will allow quantum computers to perform optimization calculations much faster than traditional computers.

Like all diamonds, the diamond used by the researchers has impurities—things other than carbon. The more impurities in a diamond, the less attractive it is as a piece of jewelry, because they make the crystal appear cloudy. The team, however, made use of the impurities themselves. A rogue nitrogen nucleus became the first qubit. In a second flaw sat an electron, which became the second qubit. (More accurately, the “spin” of each of these subatomic particles was used as the qubit.)

Electrons are smaller than nuclei and perform computations much more quickly, but they also fall victim more quickly to decoherence. A qubit based on a nucleus, which is large, is much more stable but slower. “A nucleus has a long decoherence time—in the milliseconds. You can think of it as very sluggish,” says Lidar.

Though solid-state computing systems have existed before, this was the first to incorporate decoherence protection—using microwave pulses to continually switch the direction of the electron spin rotation. “It’s a little like time travel,” Lidar says, because switching the direction of rotation time-reverses the inconsistencies in motion as the qubits move back to their original position.

“Although interactions between a quantum bit (‘qubit’) and its environment tend to corrupt the information it stores, it is possible to dynamically control qubits in a way that facilitates the execution of quantum information-processing algorithms while simultaneously protecting the qubits from environment-induced errors,” says Awschalom.

The team was able to demonstrate that its diamond-encased system does indeed operate in a quantum fashion by seeing how closely it matched the behavior of “Grover’s algorithm.” The algorithm is not new—Lov Grover of Bell Labs invented it in 1996—but it shows the promise of quantum computing. The test is a search of an unsorted database, akin to being told to search for a name in a phone book when you’ve only been given the phone number.
Sometimes you’d miraculously find it on the first try; other times you might have to search through the entire book to find it. If you did the search countless times, on average you’d find the name you were looking for after searching through half of the phone book. Mathematically, this can be expressed by saying you’d find the correct choice in X/2 tries, where X is the total number of choices you have to search through. So, with four choices total, you’ll find the correct one after two tries on average.

A quantum computer, using the properties of superposition, can find the correct choice much more quickly. The mathematics behind it are complicated, but in practical terms, a quantum computer searching through an unsorted list of four choices will find the correct choice on the first try, every time. Though not perfect, Lidar and Wang’s computer picked the correct choice on the first try about 95 percent of the time—enough to demonstrate that it operates in a quantum fashion.

“This demonstration of performing a quantum algorithm at the subatomic level with single spins suggests a pathway to build increasingly complex quantum machines, using qubit control protocols that circumvent the expected limitations from real materials,” says Awschalom.

Researchers from Delft University of Technology in the Netherlands and Iowa State University also contributed to the research, which was funded by the National Science Foundation and the U.S. Army Research Office’s Multidisciplinary University Research Initiative.
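The four-entry Grover search described above is small enough to verify with a short script. The following numpy sketch is my own illustration, an idealized simulation rather than the diamond experiment: it applies one oracle step and one diffusion step to a uniform superposition and shows the single-iteration certainty the article mentions.

```python
# Grover's search over N = 4 entries: one iteration finds the marked item
# with probability 1 in the ideal, noiseless case.
import numpy as np

N, marked = 4, 2                        # search 4 entries; entry 2 is "correct"
state = np.full(N, 1 / np.sqrt(N))      # uniform superposition, amplitude 1/2 each

state[marked] *= -1                     # oracle: flip the marked amplitude's sign
state = 2 * state.mean() - state        # diffusion: invert about the mean

print(np.abs(state) ** 2)               # [0. 0. 1. 0.] -- certainty in one step
```

The experiment's roughly 95 percent success rate reflects hardware imperfections relative to this ideal 100 percent.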
Prime Numbers, Encryption and the Linux Factor Command

Have you ever needed to print the prime factors of a number on the Linux command line? Me neither. However, a tool does exist for it. Enter the factor command. The factor command is part of the GNU Core Utilities package, so it is available on almost any Linux system. This little beauty has the singular purpose of producing the prime factors of any number. To me, this is pretty neat. To anyone interested in learning cryptography or number theory, this may be a useful, if not fun, little utility.

Prime Numbers and Prime Factors

Prime numbers have long been subjects of great interest to mathematicians, especially in the field of combinatorics. They are interesting because they are whole numbers that are only divisible by themselves and one. For example, the only way to multiply two whole numbers together to produce '5' is '5 x 1 = 5', whereas '6' factors as '3 x 2 = 6' as well as by one and itself. Six is a composite number; composite numbers are those that are not prime.

According to Wolfram Alpha, a Mersenne prime is a prime that fits the formula M = 2^n - 1, or one subtracted from a power of 2. They were named for Marin Mersenne, a 17th-century French monk who studied the numbers. It is conjectured, though not yet proven, that there are infinitely many Mersenne primes.

Prime numbers can get large in and of themselves: the largest known prime number has 24,862,048 digits! Multiplying prime numbers together, even large ones, is a straightforward task. The product of two prime numbers is called a semiprime. A cheap desk calculator can do this with ease, and plenty of people can count by prime numbers and multiply big numbers without paper using a variety of techniques.

Prime factors are a different problem altogether. Every composite number can be written as a product of primes. Simply put, the prime factors of a number are the primes that divide it, other than 1 and the number itself. Factoring is the process of breaking a number back down into the numbers originally multiplied together. For example, 9 is the product of the prime number 3 and itself. This seems simple with very small numbers like 9, especially if you've had experience factoring hundreds of polynomials in high school. As with polynomials, prime factorization becomes vastly more difficult as the numbers involved grow.

Big Prime Number, Big Problem

Multiplying big prime numbers, while still relatively easy, results in even bigger non-prime numbers. The number 330 has prime factors of 2, 3, 5, and 11. Now, go through the primes one by one and multiply them together in different combinations until you get 330: not impossible, but certainly more difficult than 9, and the larger your numbers get, the more combinations there are to try. Factoring very large numbers, such as the semiprimes used in cryptography, can take powerful computers years to complete. The shortcuts that make multiplying and dividing large numbers fast are useless for factoring. No efficient classical factoring method is known; the naive approach is trial division, testing candidate primes from 2 upward until a divisor is found, and even the clever algorithms amount to much smarter versions of that search.

Linux and its factor command use an algorithm called Pollard-Brent rho to derive the prime factors of relatively small numbers. The algorithm is quite powerful and can factor the eighth Fermat number, 2^256 + 1, in approximately 20 seconds, depending upon hardware constraints.
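GNU factor's actual implementation is more sophisticated, but the core of the rho method fits in a few lines. Here is a sketch of the classic Pollard rho routine in Python (the plain Floyd-cycle version, not Brent's optimized variant that factor uses); the constant c and the test numbers are just illustrative choices:

```python
# Pollard's rho: walk the pseudo-random sequence x -> x^2 + c (mod n) with
# two pointers at different speeds; when the sequence collides modulo a
# hidden prime factor of n, the gcd exposes that factor.
from math import gcd

def pollard_rho(n, c=1):
    if n % 2 == 0:
        return 2
    x = y = 2
    d = 1
    while d == 1:
        x = (x * x + c) % n            # tortoise: one step
        y = (y * y + c) % n            # hare: two steps
        y = (y * y + c) % n
        d = gcd(abs(x - y), n)
    # d == n means this walk failed; retry with a different constant.
    return d if d != n else pollard_rho(n, c + 1)

print(pollard_rho(8051))     # 97  (8051 = 83 x 97)
print(pollard_rho(10403))    # a factor of 10403 = 101 x 103
```

This only finds one factor at a time and assumes n is composite; a full factorizer would first test primality and then recurse on the cofactor.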
Distributions not using the GNU MP library will have reduced capability with this command.

Prime Factors and Encryption

Cryptography is an essential aspect of modern society. Computers transfer data to other computers all over the world every nanosecond of every day. Threats to the security and accuracy of information pathways challenge hardware and infrastructure improvements as fast as they are conceived. Globally, industries rely on encryption protocols to protect themselves and, sometimes only tangentially, consumers from identity theft, fraud, and violations of privacy.

Prime factors are very useful in creating encryption keys to secure information over digital transmission media. In prime-factor schemes, a very large composite number serves as the public key, which allows messages to be encrypted. Only those in possession of the secret key can decrypt the message into plain text. The secret key is built from the very large primes that are the factors of that composite number. Prime factor cryptography guarantees security and privacy by creating a factorization problem that even supercomputers, let alone the most advanced consumer electronics, would be hard-pressed to solve within a century. Because of this, some law enforcement entities seek to restrict cryptography and prime factor usage to prevent freedom fighters and terrorists alike from obtaining secure means of communication.

It is fairly common to hear or read the phrase 1024-bit encryption. This describes the size of the number used for the public key: an integer at least 2^1023 but less than 2^1024, that is, 1024 binary digits long. The secret key would be the two primes that produce this integer. While modern supercomputers cannot crack this encryption in any reasonable amount of time, quantum computing may eventually render this method useless.

Linux Factor Command

As nifty as the factor command may be, it is not useful in modern cryptography. Since factor cannot find the prime factors of large numbers within any reasonable amount of time, it is far too simplistic for modern cryptography. But it is useful for learning cryptography basics or simply enjoying the elegance of numbers.

Factor Command Syntax and Options (or lack thereof)

The factor command has no functional options. The only options that exist are --help and --version. It simply takes an argument or list of arguments in the form of an integer number. It will also accept an integer from standard input (STDIN).

```
[user@host ~]$ factor 11
11: 11
[user@host ~]$ factor 77
77: 7 11
[user@host ~]$ factor 34578 11 77
34578: 2 3 3 17 113
11: 11
77: 7 11
```

The Linux factor command is a cool bit of computing history, and it's interesting that it has remained a part of Unix since 1979. In 1986, Paul Rubin wrote a free software version of factor for the GNU project. Some UNIX/Linux variants consider factor a game rather than a utility. The current GNU documentation categorizes factor as a numerical operation, which makes more sense in my opinion. It finds use with number theorists, number enthusiasts, and anyone who requires a simple derivation of prime factors.
<urn:uuid:30949852-da89-4c81-b48f-aef041c4870a>
CC-MAIN-2022-49
https://www.putorius.net/factor-prime-numbers-encryption.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-49/segments/1669446711368.1/warc/CC-MAIN-20221208215156-20221209005156-00667.warc.gz
en
0.920984
1,411
3.5
4
Researchers at Google AI Quantum have announced a successful experiment in which, for the first time, a quantum computer has performed a task that ordinary computers based on integrated circuits are incapable of doing in a reasonable amount of time. This technical milestone paves the way for far-reaching advances in physics, chemistry, astronomy, materials science, machine learning and a host of other fields. The results were produced using Google's quantum computer, dubbed Sycamore. It is the product of a collaboration of 75 scientists, with Frank Arute of Google as lead author, spanning Google, NASA, Oak Ridge National Laboratory and more than a dozen other facilities in Germany and the United States. They compared how fast their machine and the world's most powerful supercomputer, Summit, could each sample one million random numbers from a specially designed circuit. The experiment was then repeated on increasingly complex circuits until they could show that while the quantum computer still generated a result, a classical computer effectively could not. During their final experiment, Sycamore produced its one million random numbers in 200 seconds. Summit was estimated to need 10,000 years to perform the same calculations. This staggering increase in computing speed is the first documented instance of so-called quantum supremacy. The term was popularized by John Preskill in 2011 to describe the set of problems that are intractable for even the best modern computers but should be relatively straightforward for the quantum computers being developed. It thus provides a measure to determine whether a given quantum computer has in fact surpassed the computational ability of conventional electronics. Quantum supremacy also defines certain engineering milestones. While quantum computers have always held the promise of being able to do exponentially more processes per second than conventional machines, they have proven exponentially more difficult to build and maintain. It was not at all clear that quantum computers would in practice ever surpass supercomputers. Nonetheless, Google's research indicates that there is at least one case where quantum computers are supreme, and suggests that there are many others. The end goal, however, is not just to produce random numbers. An off-the-shelf laptop can produce a million random numbers in seconds if the algorithms used to produce them are not purposefully made complicated, as were the test cases for Sycamore and Summit. Rather, quantum computers have in theory the capability of solving in minutes problems that even the best supercomputers would likely not solve in the lifespan of our solar system. Two of these are simulating the motion of atomic and subatomic particles and factoring integers of several hundred digits. To solve them, one must go beyond the familiar binary models of computation used in today's personal computers, tablets and phones. These devices store and process information in their memory using distinct physical states, usually some sort of switch being turned off or on, and the data they contain is often described as a sequence of the symbols 0 and 1. One unit of information, a bit, consists of either a 0 or 1, and the number of bits, usually discussed as bytes (where one byte equals eight bits), is the measure of the size of a computer's memory. This method of storing and retrieving information takes a small but finite amount of time, an amount which is not noticeable for a single calculation yet can grow large very quickly.
High-end modern laptops can perform tens of billions of operations per second, while the Summit supercomputer is capable of 148 million billion operations per second. And yet, while Summit could multiply two 300-digit numbers almost instantaneously, it would take the supercomputer, using its most advanced algorithms, billions of years to factor the product. A quantum computer is hypothesized to be able to perform the same operation in minutes. The original rationale for quantum computers was not to factor large numbers, a key part in certain types of encryption, but to directly simulate rather than approximate quantum mechanics. This field of physics, the study of the motion of matter at its smallest scales, is inherently probabilistic. The position and momentum of a particle are not, as in our everyday life, described as a pair of numbers but as two sets of well-defined probabilities. In the early 1980s, Soviet mathematician Yuri Manin and American physicists Paul Benioff and Richard Feynman realized that if a machine could be devised to perform operations using this property of matter, it would be able to calculate the motion of matter exactly as it occurs in nature. Instead of switches, Manin, Benioff and Feynman proposed to store information in a fundamental particle such as a photon, the basic unit of light. The value of the "qubit" is stored within the inherent spin of the photon, which is either positive or negative. The difference between a bit and a qubit, and this is key, is that a qubit initially holds both the positive and negative values at once. Only when the photon interacts with some external particle or wave will it fall into a single state, and it will do so following the probabilistic laws of quantum mechanics. This is known as "state superposition." In addition to superposition, quantum computing also takes advantage of a second property of fundamental particles known as "entanglement." It is possible to take two (or more) particles and force them to interact in such a way that even though separated, each particle acts as part of the same system. What results from this is the ability to act on a single entangled particle in a way that is instantaneously reflected across all the others within the entangled system. The combination of state superposition and entanglement is what makes quantum computers so much more powerful than classical computers. A computer with 266 bits can store or process 266 pieces of information at a time. A quantum computer with 266 qubits can work with 2^266 (roughly 10^80, a one followed by eighty zeros) amplitudes at a time, a number equivalent to the number of atoms in the observable universe. Yet qubits are incredibly difficult to operate on. The particles that are storing information react with their surroundings, either nearby matter or the so-called vacuum of spacetime, which is not "nothing" but in fact a constant creation and annihilation of particles. This can cause unknown but definite interactions, called quantum decoherence, with one particle, which then translate to each other particle with which it is entangled, forcing researchers to reset the entire system. Each particle serving as a qubit must be isolated as much as possible from these unwanted connections, typically by physically isolating it and cooling its surroundings to temperatures close to absolute zero.
While it is impossible to suppress all quantum decoherence, since that would involve stopping the motion of matter itself, a great deal of research from groups around the world has gone into eliminating most of the extraneous interactions. This effort is what has allowed Arute's team to successfully align and operate Sycamore, which consists of 53 working qubits, outperforming the world's most powerful supercomputer, which consists of many trillions of bits. This technology is expected to herald advances in a variety of fields. Quantum computers, when they are capable of surpassing supercomputers on all problems rather than just one, will be able to more quickly and accurately find exoplanets, determine the properties of new materials, study the outcomes of chemical reactions, and produce more advanced forms of artificial intelligence. They are at the same time a striking confirmation of humanity's ability to understand and master nature. Quantum computers under capitalism, however, have the capacity to reinforce oppression. Standard encryption schemes will be broken in minutes or seconds, giving nations or corporations the ability to spy on their rivals and the working class, as well as infiltrate, control and destroy the electronic systems of whole countries. Employees at their workplace can be tracked with even greater efficiency and forced to work longer and harder. Immigrants can be hunted down with facial recognition and other forms of tracking with increased ease. And the algorithms used by Google, Facebook and other tech companies in conjunction with the US military and intelligence agencies will have an unparalleled ability to censor the internet, particularly left-wing, anti-capitalist and socialist publications. While Google's Sycamore quantum computer is nowhere near capable of such feats, the social and political consequences of a private company or a capitalist government having control of such a machine must be understood. At the same time, this must galvanize struggle against capitalism and for the establishment of a society where such vast and fundamental advances can be changed from tools of violence and repression to instruments for securing a prosperous and fulfilling life for all people.
<urn:uuid:36478bca-7ae1-40e2-99eb-489ea50439c2>
CC-MAIN-2022-49
https://www.wsws.org/en/articles/2019/10/26/quan-o26.html
s3://commoncrawl/crawl-data/CC-MAIN-2022-49/segments/1669446710890.97/warc/CC-MAIN-20221202014312-20221202044312-00788.warc.gz
en
0.952584
1,750
3.796875
4
Quantum computers are not like classical computers. I don't mean that in the sense that quantum computers perform calculations in a different manner, or that they might be faster, or more clever. No, I mean that quantum computers come with a whole set of issues (read: headache-inducing problems) that normal computers don't. To reduce these problems, researchers have taken to hiding quantum information, albeit not very successfully. It turns out that using more than one type of qubit offers a bit more camouflage to quantum information. Quantum hide and seek Before we get to the latest results, let me paint a picture of pain for you. In a quantum computer, calculations are achieved by manipulating the value of a target qubit, the quantum computing equivalent of a bit, in a way that depends on the value of other qubits. The problem is doing this cleanly. In most cases, any qubits in a system are identical, so if I have a tool that can change one qubit, the same tool will change the neighboring qubits. These tools are unavoidably blunt, so modifying one qubit has a good chance of changing its neighbors. Let's look at a specific example: a quantum computer that consists of a string of ions sitting in a trap (an ion is an atom that is missing an electron). The ions influence each other by the way they collectively rock back and forth in the trap. This collective motion is used to couple qubits together, but it is very easy to disrupt. Imagine that I want to set the qubit state of the central ion. To do that, I have to shine a laser onto it; it will (eventually) absorb a photon, changing its state. But nothing says that it will absorb the first photon that hits it. A photon that is not absorbed will be scattered, like a pinball off a bumper. That recoil changes the motion of the ion in the trap, disrupting the collective motion of all the ions. This reduces the effectiveness of (and eventually kills off) the collective behavior that's needed for quantum computation. But wait, it gets worse. The scattered photon can hit a neighboring qubit and be absorbed. If that happens, you have introduced an error into your computation. You may have intended to set the state of qubit No. 3, but you have also changed the state of qubit No. 2 as well. To solve this problem, a group of researchers has shown how to use a quantum bystander to maintain the state of the qubits for much longer. Instead of using a string of identical ions, the researchers use two different ions. Beryllium ions are used for computation, and between each pair of beryllium ions they place a calcium ion. This protects quantum information in several ways. The photons scattered from the beryllium ions can't easily reach other beryllium ions because the calcium ion is in the way. The calcium ion requires an entirely different color of light, so the scattered light from a beryllium ion doesn't change the quantum state of the calcium ion, while light scattered from the calcium ion doesn't affect the beryllium ions. Yet these neighbors are not completely isolated from each other. The qubits are still coupled through the motion of the ions in the trap. Here, the calcium ion also plays a role. When the ions absorb or scatter light, they get a kick that makes their motion in the trap more vigorous. This motion needs to be controlled so that the links between qubits remain under control. To do this, the researchers can slow the calcium ions down (using lasers, naturally).
By slowing the calcium ion down, the researchers suck energy out of all the trapped ions, bringing them back under control. But what is really cool is how the researchers brought it all together in a three-qubit demonstration system (two beryllium ions and one calcium ion). The researchers put the calcium ion in a known quantum state, then perform a set of operations on all three qubits. Imperfections mean that there will eventually be some difference between the actual quantum state and the intended quantum state (i.e., the quantum information). This difference will grow with time thanks to the ions all having a slightly different environment. This difference is revealed (at least partially) by measuring the state of the calcium ion. This can be done without destroying the quantum state of the beryllium ions. In response to the measured state of the calcium ion, the trap and the state of the beryllium ions are carefully adjusted. Then the calcium ion is cooled and its state is reset. From there, the entire operation of coupling the calcium ion with the beryllium ions can be repeated. The researchers compared the reliability of their qubit state (including entangled states) with and without the trick of adjusting the trap and the state of the beryllium ions. Without adjustment, the quantum information stored in the beryllium ions quickly decays away. However, with these careful corrections, the researchers were able to perform 50 operations on the beryllium ions without losing the quantum state. The researchers' control system is not perfect; the information still decays away, but the decay rate is a good 20 times slower than it would be if they were just using two beryllium ions. The best bit, though, is that there is nothing stopping the researchers from scaling up to more ions. Three qubits is puny compared to other quantum computers. But hitting nine-plus qubits should be possible, which is about the state of the art for ion-based quantum computers. Furthermore, the cooling and control should allow for scaling to even larger numbers of qubits. It's all pretty exciting.
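The logic of that feedback loop is simple enough to caricature in a few lines of Python. Everything here is invented for illustration (the real experiment manipulates ion states, not flags), but the shape of the protocol, drift, syndrome readout via the bystander ion, conditional correction, reset, is the same:

import random

def survival(steps=50, drift=0.05, correct=True, trials=10_000):
    """Fraction of runs in which the stored state survives all the steps."""
    ok = 0
    for _ in range(trials):
        good = True
        for _ in range(steps):
            if random.random() < drift:   # the environment nudges the data qubits
                good = False
            if correct and not good:      # the calcium ion's readout flags the drift...
                good = True               # ...and the conditional correction undoes it
            # (the ancilla is then recooled and reset for the next round)
        ok += good
    return ok / trials

print(survival(correct=False))   # about 0.08: the state rarely survives 50 rounds
print(survival(correct=True))    # 1.0 in this cartoon: every drift gets caught

In the real system the correction itself is imperfect, which is why the information still decays, only about 20 times more slowly.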
<urn:uuid:9e592ae9-a6af-464c-a8f8-533b5d41dc7d>
CC-MAIN-2022-49
https://japandailysun.com/2018/11/25/like-kids-a-little-separation-keeps-qubits-calmer-for-longer/
s3://commoncrawl/crawl-data/CC-MAIN-2022-49/segments/1669446710829.5/warc/CC-MAIN-20221201153700-20221201183700-00427.warc.gz
en
0.920732
1,213
3.546875
4
From Santa Barbara, California, to Hefei, China, scientists are developing a new type of computer that will make today's machines look like toys. Harnessing the mysterious power of quantum mechanics, the technology will do in minutes what even a supercomputer could not finish in thousands of years. In the fall of 2019, Google unveiled an experimental quantum computer that showed it was possible. Two years later, a laboratory in China did much the same. But quantum computing won't reach its potential without the help of another technological breakthrough. Call it the "quantum internet": a network of computers that can send quantum information between remote machines. At Delft University of Technology in the Netherlands, a team of physicists has taken a major step toward the computer network of the future, using a technique called quantum teleportation to send data across three physical locations. Previously, this was possible only between two. The new experiments show that scientists can scale quantum networks across a growing number of sites. "We are now building small quantum networks in the laboratory," said Delft physicist Ronald Hanson, who led the team. "But our idea is to eventually build a quantum internet." Their study, published this week in the scientific journal Nature, demonstrates the power of a phenomenon that Albert Einstein once dismissed as impossible, famously deriding it as "spooky action at a distance." With quantum teleportation, information can be transferred between locations without actually moving the physical matter that holds it. This technology could profoundly change the way data is transferred from one place to another. It draws on more than a century of research in quantum mechanics, the field of physics that governs the subatomic realm, which behaves unlike anything we experience in our daily lives. Quantum teleportation not only moves data between quantum computers, but does so in a way that no one can intercept. "Not only does this mean that the quantum computer can solve your problem, but it doesn't know what the problem is," says Tracy Eleanor Northup, a researcher at the Institute of Experimental Physics at the University of Innsbruck, who is also exploring quantum teleportation. "It doesn't work that way today. Google knows what you're running on its servers." Quantum computers take advantage of the strange way certain objects behave if they are very small (like electrons or particles of light) or very cold (like exotic metals cooled to nearly absolute zero, minus 460 degrees Fahrenheit). Traditional computers perform computations by manipulating "bits" of information, each bit containing either a 1 or a 0. By exploiting the strange behavior of quantum mechanics, a quantum bit, or qubit, can store a combination of 1 and 0, a little like how a spinning coin holds the tantalizing possibility that it will come up either heads or tails when it finally lands flat on the table. This means that two qubits can hold four values at the same time, three qubits can hold eight, four can hold 16, and so on. As the number of qubits grows, the capabilities of a quantum computer increase exponentially. Researchers believe these devices could one day accelerate the development of new drugs, advance artificial intelligence, and quickly crack the encryption that protects computers vital to national security.
Globally, governments, academic labs, start-ups and tech giants are spending billions to explore the technology. In 2019, Google announced that its machine had achieved what scientists call "quantum supremacy," meaning it could perform an experimental task that conventional computers cannot. But most experts believe it will be at least a few more years before quantum computers can actually do something useful that another machine cannot. Part of the challenge is that a qubit breaks, or "decoheres," if you read information from it: it becomes an ordinary bit that can hold only a 0 or a 1, but not both. But by stringing together many qubits and developing ways to guard against decoherence, scientists hope to build machines that are both powerful and practical. Ultimately, the idea is to join these machines into networks that can send information between nodes, allowing them to be used from anywhere, much as cloud computing services from companies like Google and Amazon make processing power widely available today. But this has its own problems. Quantum information cannot be copied (the no-cloning theorem), and decoherence makes it fragile in transit, so it cannot simply be retransmitted over traditional networks. Quantum teleportation offers another option. While it cannot move objects from one place to another, it can move information by using a quantum property called "entanglement": a change in the state of one quantum system is instantly reflected in the state of another, distant one. "After entanglement, you can no longer describe these states individually," Dr. Northup said. "Fundamentally, it's one system now." These entangled systems could be electrons, particles of light, or other objects. In the Netherlands, Dr. Hanson and his team used what are called nitrogen-vacancy centers: tiny spaces in synthetic diamonds where electrons can be trapped. The team constructed three quantum systems, named Alice, Bob, and Charlie, and connected them in a straight line with strands of optical fiber. The scientists could then entangle these systems by sending individual photons, particles of light, between them. First, the researchers entangled two electrons, one belonging to Alice and the other to Bob. In effect, the electrons were given the same spin and thus bound, or entangled, in a common quantum state, each storing the same information: a particular combination of 1 and 0. The researchers could then transfer this quantum state to another qubit inside Bob's synthetic diamond, a carbon nucleus. Doing so freed Bob's electron, which the researchers could then entangle with another electron belonging to Charlie. By performing a specific quantum operation on both of Bob's qubits, the electron and the carbon nucleus, the researchers were able to join the two entangled links: Alice-Bob glued to Bob-Charlie. The result: Alice was entangled with Charlie, which allows data to travel across all three nodes. When data is transmitted this way, it does not need to physically traverse the distance between the nodes, and nothing is lost along the way. "Information can be fed into one side of the connection and then appear on the other side," Dr. Hanson said. The information also cannot be intercepted. A future quantum internet, powered by quantum teleportation, could provide a new kind of encryption that is theoretically unbreakable. In the new experiment, the network nodes were not far apart, only about 60 feet. But previous experiments have shown that quantum systems can be entangled over longer distances.
The hope is that, after several more years of research, quantum teleportation will be able to span miles. "We're now trying to do this outside the lab," said Dr. Hanson.
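The bookkeeping behind that Alice-Bob-Charlie handoff, called entanglement swapping, can be verified numerically in a few lines. The numpy sketch below (an illustration, not the Delft team's code) follows the outcome in which Bob's Bell measurement returns |Φ+⟩; the other three outcomes leave Alice and Charlie in a different Bell state that a single corrective operation converts to this one:

import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)        # |Φ+> = (|00> + |11>)/√2

# Four qubits ordered (Alice, Bob1, Bob2, Charlie): two independent Bell pairs.
psi = np.kron(bell, bell).reshape(2, 2, 2, 2)     # psi[a, b1, b2, c]

# Bob measures (Bob1, Bob2) in the Bell basis; suppose the outcome is |Φ+>.
bellmat = bell.reshape(2, 2)
phi_ac = np.einsum('aijc,ij->ac', psi, bellmat.conj())

prob = np.sum(np.abs(phi_ac) ** 2)                # chance of this outcome
phi_ac /= np.sqrt(prob)                           # post-measurement (Alice, Charlie) state

print(prob)             # 0.25
print(phi_ac.ravel())   # [0.707, 0, 0, 0.707]: Alice and Charlie now share |Φ+>

Alice's and Charlie's particles never interacted, yet after Bob's measurement they are left holding a Bell pair, exactly the resource needed for the next teleportation hop.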
<urn:uuid:74c69159-1876-46ea-a31e-d667304f450b>
CC-MAIN-2022-49
https://viraltechonly.com/2022/05/31/quantum-internet-is-getting-closer-as-data-transfer-advances/
s3://commoncrawl/crawl-data/CC-MAIN-2022-49/segments/1669446710662.60/warc/CC-MAIN-20221128203656-20221128233656-00029.warc.gz
en
0.923938
1,451
3.671875
4
By Amar Shah When the mathematical rules for quantum mechanical theory were first created, Niels Bohr and Werner Heisenberg proposed a way to interpret these rules and explain their physical implications: this became known as the Copenhagen interpretation of quantum mechanics. The idea of superposition is instrumental in this: that until the property of a particle is measured, it can be thought of as being in two different states at the same time. The most famous illustration of this is Schrödinger's cat. If you seal a cat in a box with a vial of poison that a random quantum event may or may not have broken, then after a period of time you no longer know whether the cat is dead or alive. Thus, the cat is in a superposition of being dead and alive. If you open the box to find a dead cat, then sometime while the cat was in the box it went from alive to dead. Bohr also created a model in which electrons move around an atom's nucleus the way planets orbit the sun. In the atomic case, there are specific energy levels that electrons can have. These can be thought of as specific orbits that electrons follow around the nucleus of an atom (represented by the integers n = 1, 2, 3, …). Bohr was one of the first to speculate about the changes in these energy levels. He hypothesized that changes in an electron's orbit are like changes in the "aliveness" of Schrödinger's cat. He called this change in an electron's orbit a quantum jump and predicted that such jumps occur with estimable probabilities, but are random and instantaneous unless you are continuously monitoring them. However, Zlatko Minev's new experiment observes a quantum jump between different energy levels of an artificial three-level atom and concludes that it is "continuous, coherent, and deterministic." Not only is Minev's team able to predict when the jump is about to occur, but the jump itself is not an instantaneous event as Bohr predicted; it is a continuous change in the energy level. Minev creates an artificial atom with three energy levels: Ground, Bright, and Dark. These energy levels can be thought of as similar to the energy levels, or orbits, of an electron. When the system drops to a lower energy level, it emits a photon. Minev exploits this in order to make his measurements of quantum jumps between the Ground and Bright levels. An excitation (an increase in the particle's energy level) to the Bright level is recorded by a photodetector that registers the photons emitted. Each time a photon is detected, it registers as a click, which alerts the experimenter that a quantum jump from Ground to Bright has occurred. If there are no clicks for a period of time, one can infer through the process of elimination that a quantum jump from Ground to Dark has occurred. While photodetectors generally have poor collection efficiency and often miss photons, or "clicks," Minev's experimental set-up minimizes the error in photodetection. The researchers run experiments to take note of when such clicks stop. Even though they do not directly measure the change from Ground to Dark, they use this silence as an advance warning signal for the quantum jump. The researchers also test a different version of the experiment in which they wait for the clicks to stop and then suspend all system drives. Doing so freezes the evolution of the system, causing all changes in energy level to stop. From here, they are able to reverse the trajectory of a quantum jump mid-flight, returning the energy level to the Ground state.
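The inference from silence is easy to mimic with a toy Monte Carlo in Python. All the numbers are invented and the physics is reduced to coin flips, but it shows how a run of missing clicks becomes an advance warning that a jump to Dark is underway:

import random

CLICK_RATE = 0.6       # chance per time step that the Ground-Bright cycle emits a photon
EFFICIENCY = 0.8       # chance the detector actually catches an emitted photon
WARN = 15              # this many silent steps in a row triggers the warning

def clicks(jump_at=300, steps=400):
    """1 = click heard; the atom goes Dark at t = jump_at, so the clicks cease."""
    return [1 if t < jump_at and random.random() < CLICK_RATE * EFFICIENCY else 0
            for t in range(steps)]

silence = 0
for t, c in enumerate(clicks()):
    silence = 0 if c else silence + 1
    if silence == WARN:
        print(f"warning at t={t}: {WARN} quiet steps, a jump to Dark looks underway")
        break

With these rates a 15-step silence almost never happens by accident while the atom is still cycling, so the warning typically fires just after the true jump; the occasional false alarm mirrors the imperfect detector in the real experiment.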
These results have large consequences for many fields, most notably quantum computing. Quantum computers store quantum information in artificial atoms called qubits. Sometimes there are quantum jumps in the qubits, which can cause errors in a quantum computer's calculations. Having an advance warning of these jumps could help researchers mitigate those errors. Beyond that, these results could cause a large shift in how people think about quantum mechanics. Quantum jumps are not always completely random and spontaneous, but can be predictable and even reversible. References: Faye, Jan, "Copenhagen Interpretation of Quantum Mechanics", The Stanford Encyclopedia of Philosophy (Winter 2019 Edition), Edward N. Zalta (ed.), https://plato.stanford.edu/archives/win2019/entries/qm-copenhagen/. Minev, Z. K. et al., "To Catch and Reverse a Quantum Jump Mid-Flight", Nature 570.7760 (2019): 200–204. Gleiser, Marcelo, "The Idea That Changed The World: 100 Years Of Quantum Jumps", NPR, 14 Aug. 2013, www.npr.org/sections/13.7/2013/08/14/211650524/the-idea-that-changed-the-world-100-years-of-quantum-jumps. Pitkanen, M., "Copenhagen Interpretation Dead: Long Live ZEO Based Quantum Measurement Theory!", ResearchGate, www.researchgate.net/publication/335882247_Copenhagen_interpretation_dead_long_live_ZEO_based_quantum_measurement_theory. Yale University, "Physicists can predict the jumps of Schrödinger's cat (and finally save it)", ScienceDaily, 3 June 2019, www.sciencedaily.com/releases/2019/06/190603124621.htm.
<urn:uuid:ea30fbcc-30bc-4bc8-b8d2-38a331055f86>
CC-MAIN-2022-49
https://bsj.berkeley.edu/new-experiments-can-predict-occurrences-of-quantum-jumps-may-require-scientist-to-reevaluate-old-theories/
s3://commoncrawl/crawl-data/CC-MAIN-2022-49/segments/1669446710765.76/warc/CC-MAIN-20221130160457-20221130190457-00589.warc.gz
en
0.911474
1,262
3.609375
4
Morphing DNA makes motor By Kimberly Patch, Technology Research News DNA molecules are prime candidates for helping humans make microscopic machines because they have a long history of assembling things on the molecular scale. Every one of a human's 75 to 100 trillion cells exists because a DNA molecule automatically unzipped, created the duplicate a cell needs to divide, then folded itself neatly back up again. Researchers at New York University have taken a significant step forward in being able to instruct artificial DNA molecules to move in specific ways with a method that allows certain portions of DNA to bind to each other, and then release. This reversible binding method allows for control of the shape of a DNA molecule, or machine. The researchers demonstrated the mechanism by making a four-step rotary motor out of DNA. The motor is a four-stranded DNA molecule that, prompted by separate strands of DNA, will go through a mechanical cycle over and over again. Because the process is a reversible cycle, there are no waste products. The four-stranded DNA molecule is essentially a pair of double helixes of DNA connected at several points along their lengths. When the researchers add molecules of control DNA to a solution full of the motor molecules, the short, single-stranded control molecules join with the larger molecules and rearrange them by connecting two of the double strands in one place and cutting them in another. The researchers then remove the control strands using fuel strands of DNA, which are also short single-stranded lengths of DNA. This leaves the motor molecule in a different physical shape than when it started -- the end of one double strand of the DNA is rotated 180 degrees relative to the strands next to it. The process can be reversed by adding a different type of control strand to the solution, and that control strand can also be removed by a different type of fuel strand after it changes the molecule back. "The system can be cycled numerous times... and there are no breakdown products," said Nadrian Seeman, a chemistry professor at New York University. The process can be adapted to many different sequences of DNA, said Seeman. "Many different species of this device can be made by changing the sequences in the region where the... strands bind," he said. This means a wide range of similar rotary devices can be created by changing the fuel strands and the places where they bind, he said. Ten different molecules can result in 1,024 different structures, for instance. The researchers are currently working on a method to insert the DNA devices into molecular lattices, said Seeman. This would enable still more structures. An array of four by four molecules, for instance, could produce 65,536 different shapes. "This may enable us to build nanofabrication facilities to produce new molecular species," he said. The range of motion the molecular motors can produce runs from 0.04 to 4 nanometers, though the researchers have produced motions as large as 35 nanometers using arrays, according to Seeman. A nanometer is one millionth of a millimeter. On this scale, an E. coli bacterium is a relative giant, with a girth of 1 micron, or 1,000 nanometers. A line of ten carbon atoms measures about one nanometer. The research is "great stuff," said Erik Winfree, an assistant professor of computer science and computation and neural systems at the California Institute of Technology. The method is a step forward in terms of DNA mechanics, he said.
"It expands our toolbox for designing molecular machines." The research is ultimately aimed at making nanorobotics practical, according to Seeman. "It could be used to configure a molecular pegboard or control molecular assemblers. The ability to achieve many different shapes means that you can create many different patterns; different patterns in a timed sequence are the essence of a machine or robot," he said. Molecular machines could be used to assemble drugs molecule-by-molecule, and molecular robots may eventually work inside the human body. It will be about a decade before the method can be used to make practical devices, said Seeman. Seeman's research colleagues were Hao Yan, Xiaoping Zhang and Zhiyong Shen. They published the research in the January 3, 2002 issue of Nature. The research was funded by the National Science Foundation (NSF), Office of Naval Research (ONR), the National Institutes of Health (NIH) and the Defense Advanced Research Projects Agency (DARPA). Timeline: 10 years TRN Categories: Biological, Chemical, DNA and Molecular Computing; Nanotechnology Story Type: News Related Elements: Technical paper, "A Robust DNA Mechanical Device Controlled by Hybridization Topology," Nature, January 3, 2002. January 16, 2002 Morphing DNA makes motor Toolset teams computers to design drugs Atom clouds ease quantum computing Web pages cluster by content type Quantum effect alters device motion Research News Roundup Research Watch blog View from the High Ground Q&A How It Works News | Blog | Books Buy an ad link Ad links: Clear History Buy an ad link © Copyright Technology Research News, LLC 2000-2006. All rights reserved.
<urn:uuid:1632c68a-5746-4537-9e0b-ad350e4f96b7>
CC-MAIN-2014-49
http://www.trnmag.com/Stories/2002/011602/Morphing_DNA_makes_motor_011602.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931012025.85/warc/CC-MAIN-20141125155652-00197-ip-10-235-23-156.ec2.internal.warc.gz
en
0.918928
1,077
3.84375
4
In a recent experiment, scientists were able to observe quasiparticles propagating across a string of ions, creating waves of quantum entanglement in their wake. Experiments like this one, which study systems with multiple quantum bodies, are crucial to learning about the behavior of quasiparticles and their interactions with more traditional particles. It's tempting to think that quasiparticles are not particles at all. Quasiparticles are "objects" that emerge within a complex system, such as a solid object. The collective behavior of the particles in the solid can create the impression of a new particle. The impression, or quasiparticle, moves through the solid as if it were a real particle moving through empty space, and it behaves according to the same rules. Nevertheless, within their system, quasiparticles can have real effects on their environment. Recently, scientists were able to track the propagation of quasiparticles called magnons through a collection of atoms. Now, scientists have been able to watch as that propagation changed the behavior of those atoms. And in the process, the quasiparticles reached speeds at which a conventional model that we use to understand how influences spread in time breaks down. To make these observations, the researchers lined up seven ions and targeted the fourth ion, exactly in the middle of the line, with a laser. The laser changes the ion's quantum spin direction. Changing the spin of the fourth (middle) ion sends out quasiparticles in both directions, much in the same way that a pebble dropped into a pond sends out ripples in all directions. In this case, the "quasiparticle" was essentially a wave of altered spin states. Before the experiment began, all the ions had the same spin direction. But once the middle ion's spin had been reversed, it quickly changed the spins of the two ions that flanked it, starting a chain reaction: a wave, or quasiparticle, moving in each direction. The quasiparticles generated are called magnons. As the two magnons moved away from the middle of the line, entanglement moved with them. That is, as the magnon moving to the right passed over ion 5, and as the one moving to the left passed over ion 3, ions 3 and 5 became entangled with each other. The scientists were able to measure how the entanglement changed with time as the two magnons propagated away from each other. Their results agreed very closely with prediction: pairs of ions were briefly measured to be entangled as the pair of magnons moved over them, and then ceased to be entangled once the magnons had passed. The experiment also had a second layer. The scientists were able to "tune" the range of interactions between the ions in the system. In other words, they could adjust how far one ion's influence on its neighbors reaches. In the first part of the experiment, each ion's spin essentially influenced only its immediate neighbors. In the second, the researchers adjusted the system so that an ion's spin could leapfrog adjacent ions and change the spins of more distant ones. The resulting collective behavior of the ions still produced quasiparticles, but quasiparticles of a different sort, moving at a different speed. As they tuned the system to three different interaction ranges, the quasiparticles became faster and faster, ultimately approaching infinite speed. Actual infinite speed is not possible, even in a quantum system, due to a speed limit called the Lieb-Robinson bound.
The actual top speed of a quasiparticle may vary depending on the system it inhabits, but it is always finite. However, according to the researchers, the Lieb-Robinson bounds become "trivial" in some settings, such as their tuned system with long-range interactions, meaning that in those settings the bounds place essentially no restriction on the speed of the quasiparticles. The unbounded speed also strains our conventional picture of causality. A relativistic model called the light-cone is often used to understand this. A light-cone is a map of the furthest points that light, and therefore any signal, can reach from an event in a given time. Nothing can travel faster than the speed of light, so only events in the "past" part of an object's light-cone can possibly transfer information to that object. This model holds in the first part of the experiment, but once the researchers had tuned the interaction ranges of the ions, they found that the speeds of the quasiparticles were such that they could no longer be described in terms of light-cones. The experiment is significant not only for its findings, which agree closely with prediction (and mark the first time entanglement carried by quasiparticles has been observed), but also because it lays the groundwork for many future avenues of study. Experiments like this one, involving many-body systems, are crucial to our understanding of a wide range of quantum phenomena.
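In the short-range regime the magnon picture is simple enough to simulate directly. Within the single-excitation sector an XY spin chain reduces to a 7x7 hopping matrix, so the numpy/scipy sketch below (with an invented nearest-neighbor coupling; the experiment's tunable long-range interactions are not modeled) reproduces the pebble-in-a-pond spreading:

import numpy as np
from scipy.linalg import expm

N, J = 7, 1.0
H = np.zeros((N, N))
for i in range(N - 1):                  # nearest-neighbor spin-flip hopping
    H[i, i + 1] = H[i + 1, i] = J

psi0 = np.zeros(N, dtype=complex)
psi0[N // 2] = 1.0                      # the laser flips the middle ion (ion 4)

for t in (0.0, 0.5, 1.0, 1.5):
    psi_t = expm(-1j * H * t) @ psi0
    probs = np.round(np.abs(psi_t) ** 2, 3)
    print(f"t={t:.1f}  P(flipped spin on each ion) = {probs}")

The flipped spin leaks out of the center symmetrically, and ions equidistant from the middle (such as 3 and 5) light up together, which is exactly where the experiment finds the entangled pairs.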
<urn:uuid:bc0ba4d7-5b43-4482-8106-4e2eb651f530>
CC-MAIN-2014-49
http://arstechnica.com/science/2014/07/quasiparticles-carry-entanglement-to-near-infinite-speeds/
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400380464.40/warc/CC-MAIN-20141119123300-00200-ip-10-235-23-156.ec2.internal.warc.gz
en
0.959517
1,047
4.125
4
Alice and Bob Alice and Bob are two commonly used placeholder names. They are used for archetypal characters in fields such as cryptography and physics. The names are used for convenience; for example, "Alice sends a message to Bob encrypted with his public key" is easier to follow than "Party A sends a message to Party B encrypted by Party B's public key." Following the alphabet, the specific names have evolved into common parlance within these fields, helping technical topics to be explained in a more understandable fashion. These placeholder names are used for convenience and easier understanding. For example, if a writer wants to explain encrypted emails, the explanation might be: - 1. Alice gets Bob's public key from the company directory. - 2. Alice sends a message to Bob encrypted with Bob's public key. - 3. Bob can use his secret key to unscramble it. Every reader can intuitively figure out that they themselves could do the same thing as Bob or Alice. In cryptography and computer security, there are a number of widely used names for the participants in discussions and presentations about various protocols. The names are conventional, somewhat self-suggestive, sometimes humorous, and effectively act as metasyntactic variables. In typical implementations of these protocols, it is understood that the actions attributed to characters such as Alice or Bob need not always be carried out by human parties directly, but may instead be carried out by a trusted automated agent (such as a computer program) on their behalf. Cast of characters This list is drawn mostly from the book Applied Cryptography by Bruce Schneier. Alice and Bob are archetypes in cryptography; Eve is also common. Names further down the alphabet are less common. - Alice and Bob. Generally, Alice wants to send a message to Bob. These names were used by Ron Rivest in the 1978 Communications of the ACM article presenting the RSA cryptosystem, and in A Method for Obtaining Digital Signatures and Public-Key Cryptosystems published April 4, 1977, revised September 1, 1977, as technical Memo LCS/TM82. Rivest denies that these names have any relation to the 1969 movie Bob & Carol & Ted & Alice, as occasionally suggested by others. - Carol, Carlos or Charlie, as a third participant in communications. - Chuck, as a third participant usually of malicious intent. - Craig, the password cracker (usually encountered in situations with stored hashed/salted passwords). - Dan or Dave, a fourth participant. - Erin, a fifth participant. (It's rare to see Erin; E is usually reserved for Eve.) - Eve, an eavesdropper, is usually a passive attacker. While she can listen in on messages between Alice and Bob, she cannot modify them. In quantum cryptography, Eve may also represent the environment. - Frank, a sixth participant (and so on alphabetically). - Mallet or Mallory, a malicious attacker (less commonly called Trudy, an intruder); unlike the passive Eve, this one is the active man-in-the-middle attacker who can modify messages, substitute his or her own messages, replay old messages, and so on. The difficulty of securing a system against Mallet/Mallory is much greater than against Eve.
- Oscar, an opponent, similar to Mallet/Mallory but not necessarily malicious. Could be white-hat but still wants to crack, modify, substitute, or replay messages. - Peggy, a prover, and Victor, a verifier, often must interact in some way to show that the intended transaction has actually taken place. They are often found in zero-knowledge proofs. Alternate names for the prover and the verifier are Pat and Vanna, after Pat Sajak and Vanna White, the hosts of Wheel of Fortune. - Sybil, an attacker who marshals a large number of pseudonymous identities, e.g. to subvert a reputation system. See Sybil attack. - Trent, a trusted arbitrator, is some kind of neutral third party, whose exact role varies with the protocol under discussion. - Walter, a warden, may be needed to guard Alice and Bob in some respect, depending on the protocol being discussed. - Wendy, a whistleblower, is an insider threat with privileged information. Although an interactive proof system is not quite a cryptographic protocol, it is sufficiently related to mention the cast of characters its literature features: - Arthur and Merlin: In interactive proof systems, the prover has unbounded computational ability and is hence associated with Merlin, the powerful wizard. He claims the truth of a statement, and Arthur, the wise king, questions him to verify the claim. These two characters also give the name for two complexity classes, namely MA and AM. - A similar pair of characters is Paul and Carole. The characters were introduced in the solution of the Twenty Questions problem, where "Paul", who asked questions, stood for Paul Erdős and "Carole", who answered them, was an anagram of "oracle". They were further used in certain combinatorial games in the roles of Pusher and Chooser respectively, and have since been used in various roles. References - Newton, David E. (1997). Encyclopedia of Cryptography. Santa Barbara, California: Instructional Horizons, Inc. p. 10. - RFC 4949 - "Security's inseparable couple". Network World. February 7, 2005. - Tanenbaum, Andrew S. (2007). Distributed Systems: Principles and Paradigms. Pearson Prentice Hall. pp. 171, 399–402. ISBN 978-0-13-239227-3. - Schneier, Bruce (1994). Applied Cryptography: Protocols, Algorithms, and Source Code in C. Wiley. ISBN 9780471597568. p. 44: "Mallet can intercept Alice's database inquiry, and substitute his own public key for Alice's. He can do the same to Bob." - Perkins, Charles L. et al. (2000). Firewalls: 24seven. Network Press. ISBN 9780782125290. p. 130: "Mallet maintains the illusion that Alice and Bob are talking to each other rather than to him by intercepting the messages and retransmitting them." - LaMacchia, Brian (2002). .NET Framework Security. Addison-Wesley. ISBN 9780672321849. p. 616: "Mallet represents an active adversary that not only listens to all communications between Alice and Bob but can also modify the contents of any communication he sees while it is in transit." - Dolev, Shlomi, ed. (2009). Algorithmic Aspects of Wireless Sensor Networks. Springer. ISBN 9783642054334. p. 67: "We model key choices of Alice, Bob and adversary Mallet as independent random variables A, B and M [...]" - Schneier, Bruce (1996). Applied Cryptography: Protocols, Algorithms, and Source Code in C, Second Edition. Wiley. ISBN 9780471117094. p. 23: Table 2.1: Dramatis Personae. - Lund, Carsten et al. (1992). "Algebraic Methods for Interactive Proof Systems". J. ACM 39 (4): 859–868. doi:10.1145/146585.146605.
- Spencer, Joel; Winkler, Peter (1992), Three Thresholds for a Liar, Combinatorics, Probability and Computing 1 (01): 81–93, doi:10.1017/S0963548300000080 - Muthukrishnan, S. (2005), Data Streams: Algorithms and Applications, Now Publishers, p. 3, ISBN 978-1-933019-14-7 - C.H. Lindsey, Regulation of Investigatory Powers Bill: Some Scenarios, 2000 - A Method for Obtaining Digital Signatures and Public-Key Cryptosystems - The Alice and Bob After-Dinner Speech, given at the Zurich Seminar, April 1984, by John Gordon - Geek Song: "Alice and Bob" - Alice and Bob jokes (mainly Quantum Computing-related) - Alice and Bob: IT's inseparable couple - A short history of Bobs (story and slideshow) in the computing industry, from Alice & Bob to Microsoft Bob and Father of Ethernet Bob Metcalfe - Alice and Bob en Français
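A concrete way to see the cast in action is a toy Diffie-Hellman key exchange in Python, with Eve in her usual role as the passive eavesdropper. The numbers below are tiny and chosen only for illustration; real deployments use 2048-bit groups or elliptic curves:

p, g = 23, 5                    # public parameters, visible to Eve

alice_secret = 6                # known only to Alice
bob_secret = 15                 # known only to Bob

A = pow(g, alice_secret, p)     # Alice sends Bob: 8
B = pow(g, bob_secret, p)       # Bob sends Alice: 19

alice_key = pow(B, alice_secret, p)   # each side combines the other's
bob_key = pow(A, bob_secret, p)       # public value with its own secret

assert alice_key == bob_key           # shared secret: 2

Eve saw p, g, A and B go over the wire, but recovering either secret exponent from them is the discrete-logarithm problem, which is infeasible at realistic sizes. Mallory, the active attacker, is a different story, which is why such exchanges are authenticated in practice.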
<urn:uuid:85658086-969c-4978-aab4-19a790d91c1f>
CC-MAIN-2014-49
http://en.wikipedia.org/wiki/Placeholder_names_in_cryptography
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931003959.7/warc/CC-MAIN-20141125155643-00090-ip-10-235-23-156.ec2.internal.warc.gz
en
0.874363
1,845
3.515625
4
First Electronic Quantum Processor Created 2009 07 01 A team led by Yale University researchers has created the first rudimentary solid-state quantum processor, taking another step toward the ultimate dream of building a quantum computer. The two-qubit processor is the first solid-state quantum processor that resembles a conventional computer chip and is able to run simple algorithms. (Credit: Blake Johnson/Yale University) They also used the two-qubit superconducting chip to successfully run elementary algorithms, such as a simple search, demonstrating quantum information processing with a solid-state device for the first time. Their findings appeared in Nature's advance online publication on June 28. "Our processor can perform only a few very simple quantum tasks, which have been demonstrated before with single nuclei, atoms and photons," said Robert Schoelkopf, the William A. Norton Professor of Applied Physics & Physics at Yale. "But this is the first time they've been possible in an all-electronic device that looks and feels much more like a regular microprocessor." Working with a group of theoretical physicists led by Steven Girvin, the Eugene Higgins Professor of Physics & Applied Physics, the team manufactured two artificial atoms, or qubits ("quantum bits"). While each qubit is actually made up of a billion aluminum atoms, it acts like a single atom that can occupy two different energy states. These states are akin to the "1" and "0" or "on" and "off" states of regular bits employed by conventional computers. Because of the counterintuitive laws of quantum mechanics, however, scientists can effectively place qubits in a "superposition" of multiple states at the same time, allowing for greater information storage and processing power. For example, imagine having four phone numbers, including one for a friend, but not knowing which number belonged to that friend. You would typically have to try two or three numbers before you dialed the right one. A quantum processor, on the other hand, can find the right number in only one try. "Instead of having to place a phone call to one number, then another number, you use quantum mechanics to speed up the process," Schoelkopf said. "It's like being able to place one phone call that simultaneously tests all four numbers, but only goes through to the right one." These sorts of computations, though simple, have not been possible using solid-state qubits until now, in part because scientists could not get the qubits to last long enough. While the first qubits of a decade ago were able to maintain specific quantum states for about a nanosecond, Schoelkopf and his team are now able to maintain theirs for a microsecond, a thousand times longer, which is enough to run the simple algorithms. To perform their operations, the qubits communicate with one another using a "quantum bus" (photons that transmit information through wires connecting the qubits) previously developed by the Yale group. The key that made the two-qubit processor possible was getting the qubits to switch "on" and "off" abruptly, so that they exchanged information quickly and only when the researchers wanted them to, said Leonardo DiCarlo, a postdoctoral associate in applied physics at Yale's School of Engineering & Applied Science and lead author of the paper. Next, the team will work to increase the amount of time the qubits maintain their quantum states so they can run more complex algorithms. They will also work to connect more qubits to the quantum bus.
The processing power increases exponentially with each qubit added, Schoelkopf said, so the potential for more advanced quantum computing is enormous. But he cautions it will still be some time before quantum computers are being used to solve complex problems. "We're still far away from building a practical quantum computer, but this is a major step forward." Authors of the paper include Leonardo DiCarlo, Jerry M. Chow, Lev S. Bishop, Blake Johnson, David Schuster, Luigi Frunzio, Steven Girvin and Robert Schoelkopf (all of Yale University), Jay M. Gambetta (University of Waterloo), Johannes Majer (Atominstitut der Österreichischen Universitäten) and Alexandre Blais (Université de Sherbrooke). Article source: ScienceDaily.com
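The four-phone-numbers story is, in miniature, Grover's search on two qubits: one oracle call suffices to pick the marked item out of four. The numpy sketch below shows the linear algebra only; it is not the Yale device's pulse-level control, and the marked index is an arbitrary choice:

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
H2 = np.kron(H, H)                    # Hadamard on both qubits

marked = 2                            # pretend the friend's number is index 2 (|10>)
oracle = np.eye(4)
oracle[marked, marked] = -1           # flips the sign of the marked state only

diffusion = 2 * np.full((4, 4), 0.25) - np.eye(4)   # inversion about the mean

psi = H2 @ np.array([1, 0, 0, 0])     # uniform superposition over the four "numbers"
psi = diffusion @ (oracle @ psi)      # a single Grover iteration

print(np.abs(psi) ** 2)               # [0, 0, 1, 0]: the right number, found in one try

For two qubits a single iteration lands on the marked state with probability 1, which is the precise sense in which one quantum "phone call" tests all four numbers at once.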
<urn:uuid:97ad487a-6c71-489f-af89-4b35908fc8d9>
CC-MAIN-2014-49
http://www.redicecreations.com/article.php?id=6996
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931012025.85/warc/CC-MAIN-20141125155652-00217-ip-10-235-23-156.ec2.internal.warc.gz
en
0.915734
1,495
3.859375
4
More precisely, quantum teleportation is a quantum protocol by which a qubit a (the basic unit of quantum information) can be transmitted exactly (in principle) from one location to another. The prerequisites are a conventional communication channel capable of transmitting two classical bits (i.e. one of four states), and an entangled pair (b,c) of qubits, with b at the origin and c at the destination. (So whereas b and c are intimately related, a is entirely independent of them other than being initially colocated with b.) The protocol has three steps: measure a and b jointly to yield two classical bits; transmit the two bits to the other end of the channel (the only potentially time-consuming step, due to speed-of-light considerations); and use the two bits to select one of four ways of recovering c. The upshot of this protocol is to permute the original arrangement ((a,b),c) to ((b′,c′),a), that is, a moves to where c was and the previously separated qubits of the Bell pair turn into a new Bell pair (b′,c′) at the origin. Suppose Alice has a qubit in some arbitrary quantum state |ψ⟩. Assume that this quantum state is not known to Alice and she would like to send this state to Bob. Ostensibly, Alice has the following options: (1) physically transport the qubit itself to Bob; (2) make copies of the state and send one of them; or (3) measure the state and send Bob a classical description of it. Option 1 is highly undesirable because quantum states are fragile and any perturbation en route would corrupt the state. The unavailability of option 2 is the statement of the no-broadcast theorem. Similarly, it has also been shown formally that classical teleportation, also known as option 3, is impossible; this is called the no-teleportation theorem. This is another way to say that quantum information cannot be measured reliably. Thus, Alice seems to face an impossible problem. A solution was discovered by Bennett and co-workers in 1993. The parts of a maximally entangled two-qubit state are distributed to Alice and Bob. The protocol then involves Alice and Bob interacting locally with the qubit(s) in their possession and Alice sending two classical bits to Bob. In the end, the qubit in Bob's possession will be in the desired state. Alice applies a unitary operation on the qubits AC and measures the result to obtain two classical bits. In this process, the two qubits are destroyed. Bob's qubit, B, now contains information about C; however, the information is somewhat randomized. More specifically, Bob's qubit B is in one of four states uniformly chosen at random and Bob cannot obtain any information about C from his qubit. Alice sends Bob her two measured classical bits, which indicate which of the four states Bob possesses. Bob applies a unitary transformation which depends on the bits he obtains from Alice, transforming his qubit into an identical copy of the qubit C. Suppose Alice has a qubit that she wants to teleport to Bob. This qubit can be written generally as |ψ⟩_C = α|0⟩_C + β|1⟩_C. Alice takes one of the particles in an entangled pair, and Bob keeps the other one. The subscripts A and B in the entangled state refer to Alice's or Bob's particle. We will assume that Alice and Bob share the entangled state |Φ+⟩_AB = (1/√2)(|00⟩_AB + |11⟩_AB). So, Alice has two particles (C, the one she wants to teleport, and A, one of the entangled pair), and Bob has one particle, B. In the total system, the state of these three particles is given by |ψ⟩_C ⊗ |Φ+⟩_AB = (1/√2)(α|0⟩_C + β|1⟩_C) ⊗ (|00⟩_AB + |11⟩_AB). Alice will then make a partial measurement in the Bell basis on the two qubits in her possession.
To make the result of her measurement clear, we will rewrite the two qubits of Alice in the Bell basis via the following general identities (these can be easily verified): $|0\rangle|0\rangle = \tfrac{1}{\sqrt{2}}(|\Phi^+\rangle + |\Phi^-\rangle)$, $|0\rangle|1\rangle = \tfrac{1}{\sqrt{2}}(|\Psi^+\rangle + |\Psi^-\rangle)$, $|1\rangle|0\rangle = \tfrac{1}{\sqrt{2}}(|\Psi^+\rangle - |\Psi^-\rangle)$, and $|1\rangle|1\rangle = \tfrac{1}{\sqrt{2}}(|\Phi^+\rangle - |\Phi^-\rangle)$. The three-particle state shown above thus becomes the following four-term superposition: $\tfrac{1}{2}\bigl[\,|\Phi^+\rangle_{CA}(\alpha|0\rangle_B + \beta|1\rangle_B) + |\Phi^-\rangle_{CA}(\alpha|0\rangle_B - \beta|1\rangle_B) + |\Psi^+\rangle_{CA}(\alpha|1\rangle_B + \beta|0\rangle_B) + |\Psi^-\rangle_{CA}(\alpha|1\rangle_B - \beta|0\rangle_B)\,\bigr]$. Notice all we have done so far is a change of basis on Alice's part of the system. No operation has been performed and the three particles are still in the same state. The actual teleportation starts when Alice measures her two qubits in the Bell basis. Given the above expression, evidently the result of her (local) measurement is that the three-particle state collapses to one of the following four states (with equal probability of obtaining each): $|\Phi^+\rangle_{CA} \otimes (\alpha|0\rangle_B + \beta|1\rangle_B)$, $|\Phi^-\rangle_{CA} \otimes (\alpha|0\rangle_B - \beta|1\rangle_B)$, $|\Psi^+\rangle_{CA} \otimes (\alpha|1\rangle_B + \beta|0\rangle_B)$, or $|\Psi^-\rangle_{CA} \otimes (\alpha|1\rangle_B - \beta|0\rangle_B)$. Alice's two particles are now entangled to each other, in one of the four Bell states. The entanglement originally shared between Alice's and Bob's particles is now broken. Bob's particle takes on one of the four superposition states shown above. Note how Bob's qubit is now in a state that resembles the state to be teleported. The four possible states for Bob's qubit are unitary images of the state to be teleported. The crucial step, the local measurement done by Alice in the Bell basis, is done. It is clear how to proceed further. Alice now has complete knowledge of the state of the three particles; the result of her Bell measurement tells her which of the four states the system is in. She simply has to send her results to Bob through a classical channel. Two classical bits can communicate which of the four results she obtained. After Bob receives the message from Alice, he will know which of the four states his particle is in. Using this information, he performs a unitary operation on his particle to transform it to the desired state $|\psi\rangle$: for $|\Phi^+\rangle$ he does nothing (identity), for $|\Phi^-\rangle$ he applies the phase flip $Z$, for $|\Psi^+\rangle$ the bit flip $X$, and for $|\Psi^-\rangle$ he applies $X$ followed by $Z$. Teleportation is therefore achieved. Experimentally, the projective measurement done by Alice may be achieved via a series of laser pulses directed at the two particles. In the literature, one might find alternative, but completely equivalent, descriptions of the teleportation protocol given above. Namely, the unitary transformation that is the change of basis (from the standard product basis into the Bell basis) can also be implemented by quantum gates. Direct calculation shows that this gate is given by $G = \mathrm{CNOT} \cdot (H \otimes I)$, a Hadamard on the first qubit followed by a CNOT with the first qubit as control. Teleportation can be applied not just to pure states, but also to mixed states, or even to the undefined state of a particle that is itself entangled with another. The so-called entanglement swapping is a simple and illustrative example. If Alice has a particle which is entangled with a particle owned by Bob, and Bob teleports it to Carol, then afterwards, Alice's particle is entangled with Carol's. A more symmetric way to describe the situation is the following: Alice has one particle, Bob two, and Carol one. Alice's particle and Bob's first particle are entangled, and so are Bob's second and Carol's particle: Alice-:-:-:-:-:-Bob1 -:- Bob2-:-:-:-:-:-Carol Now, if Bob performs a projective measurement on his two particles in the Bell state basis and communicates the results to Carol, as per the teleportation scheme described above, the state of Bob's first particle can be teleported to Carol's. Although Alice and Carol never interacted with each other, their particles are now entangled. One can imagine how the teleportation scheme given above might be extended to N-state particles, i.e. particles whose states lie in an N-dimensional Hilbert space. The combined system of the three particles now has an N³-dimensional state space.
To teleport, Alice makes a partial measurement on the two particles in her possession in some entangled basis on the N²-dimensional subsystem. This measurement has N² equally probable outcomes, which are then communicated to Bob classically. Bob recovers the desired state by sending his particle through an appropriate unitary gate. A general teleportation scheme can be described as follows. Three quantum systems are involved. System 1 is the (unknown) state ρ to be teleported by Alice. Systems 2 and 3 are in a maximally entangled state ω that is distributed to Alice and Bob, respectively. The total system is then in the state $\rho \otimes \omega$, and the overall protocol defines a channel Φ; teleportation succeeds when $(\mathrm{Tr}_{12} \circ \Phi)(\rho \otimes \omega) = \rho$, where $\mathrm{Tr}_{12}$ is the partial trace operation with respect to systems 1 and 2, and $\circ$ denotes the composition of maps. This describes the channel in the Schrödinger picture. Taking adjoint maps in the Heisenberg picture, the success condition becomes $\mathrm{Tr}\bigl[(\rho \otimes \omega)\,\Phi^*(\mathbb{1}_{12} \otimes O)\bigr] = \mathrm{Tr}(\rho\,O)$ for all observables O on Bob's system. The proposed channel Φ can be described more explicitly. To begin teleportation, Alice performs a local measurement on the two subsystems (1 and 2) in her possession. Assume the local measurement has effects $\{F_i\}$. If the measurement registers the i-th outcome, the overall state collapses (up to normalization) to $\bigl(F_i^{1/2} \otimes \mathbb{1}_3\bigr)(\rho \otimes \omega)\bigl(F_i^{1/2} \otimes \mathbb{1}_3\bigr)$. Bob then applies a corresponding local operation Ψi on system 3. On the combined system, this is described by $\mathrm{Id}_{12} \otimes \Psi_i$, where Id is the identity map on the composite system of parts 1 and 2. Therefore the channel Φ is defined by $\Phi(\rho \otimes \omega) = \sum_i (\mathrm{Id}_{12} \otimes \Psi_i)\bigl[(F_i^{1/2} \otimes \mathbb{1}_3)(\rho \otimes \omega)(F_i^{1/2} \otimes \mathbb{1}_3)\bigr]$. Notice that Φ satisfies the definition of LOCC (local operations and classical communication). As stated above, the teleportation is said to be successful if, for all observables O on Bob's system, the equality $\mathrm{Tr}\bigl[\Phi(\rho \otimes \omega)(\mathbb{1}_{12} \otimes O)\bigr] = \mathrm{Tr}(\rho\,O)$ holds. The left-hand side of the equation is $\sum_i \mathrm{Tr}\bigl[(F_i^{1/2} \otimes \mathbb{1}_3)(\rho \otimes \omega)(F_i^{1/2} \otimes \mathbb{1}_3)(\mathbb{1}_{12} \otimes \Psi_i^*(O))\bigr]$, where Ψi* is the adjoint of Ψi in the Heisenberg picture. Assuming all objects are finite dimensional, this becomes $\sum_i \mathrm{Tr}\bigl[(\rho \otimes \omega)(F_i \otimes \Psi_i^*(O))\bigr]$. The success criterion for teleportation therefore has the expression $\sum_i \mathrm{Tr}\bigl[(\rho \otimes \omega)(F_i \otimes \Psi_i^*(O))\bigr] = \mathrm{Tr}(\rho\,O)$ for all observables O.
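The standard qubit protocol above is easy to check numerically. Below is a minimal NumPy sketch (an illustrative simulation of the textbook protocol, not tied to any experiment; the qubit ordering C, A, B and all helper names are our own choices for the example):

import numpy as np

# Minimal state-vector simulation of the teleportation protocol above.
# Qubit order in the 8-component state vector: C (Alice's unknown), A, B.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

rng = np.random.default_rng(7)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)          # random |psi> = alpha|0> + beta|1>

bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
state = np.kron(psi, bell)          # |psi>_C tensor |Phi+>_AB

# Bell basis on Alice's pair (C, A) and Bob's matching corrections
phi_p = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
phi_m = (np.kron(ket0, ket0) - np.kron(ket1, ket1)) / np.sqrt(2)
psi_p = (np.kron(ket0, ket1) + np.kron(ket1, ket0)) / np.sqrt(2)
psi_m = (np.kron(ket0, ket1) - np.kron(ket1, ket0)) / np.sqrt(2)
bell_basis = [phi_p, phi_m, psi_p, psi_m]
corrections = [I, Z, X, Z @ X]      # identity, phase flip, bit flip, both

# Simulate Alice's Bell measurement by projecting onto each outcome
posts, probs = [], []
for b in bell_basis:
    projector = np.kron(np.outer(b, b.conj()), I)
    post = projector @ state
    posts.append(post)
    probs.append(np.linalg.norm(post) ** 2)

p = np.array(probs) / sum(probs)
outcome = rng.choice(4, p=p)        # Alice's two classical bits (0..3)
post = posts[outcome] / np.linalg.norm(posts[outcome])

# Extract Bob's qubit and apply the correction Alice's bits select
bob = bell_basis[outcome].conj() @ post.reshape(4, 2)
bob = corrections[outcome] @ bob
print("outcome:", outcome, "fidelity:", abs(np.vdot(psi, bob)) ** 2)  # ~1.0

Running this prints a fidelity of 1 (up to floating-point error) for every measurement outcome, which is exactly the claim of the protocol: Bob's qubit ends in the state Alice started with, even though only two classical bits were transmitted.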
<urn:uuid:efe780ea-7ff1-4da7-8b9a-e27579d8254a>
CC-MAIN-2014-49
http://www.reference.com/browse/quantum+teleportation
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400380233.64/warc/CC-MAIN-20141119123300-00034-ip-10-235-23-156.ec2.internal.warc.gz
en
0.939057
1,865
3.765625
4
On December 17, 1903, at Kitty Hawk, North Carolina, the 1903 Wright Flyer became the first powered, heavier-than-air machine to achieve controlled, sustained flight with a pilot aboard. On this date in 1969, Neil Armstrong, aboard the Apollo 11 Lunar Lander, along with Buzz Aldrin, touched down on the surface of the moon. Michael Collins waited aboard the Command Module, orbiting the moon. You run and you run to catch up with the sun but it's sinking Racing around to come up behind you again. The sun is the same in a relative way but you're older, Shorter of breath and one day closer to death. "Time" by Mason, Waters, Wright, Gilmour Through the insights of the brightest minds, we begin to see evidence of tachyons, particles that travel backwards in time, and through experimentation with the LHC, we begin to see elementary particles. Mind you, the theory was a proper theory in the sense that it was mathematically consistent, and also because it predicted certain observable consequences, namely, that if tachyons existed they would emit a certain type of radiation (Cerenkov radiation) in a vacuum. This radiation was searched for, and none was found. So, after a flurry of excitement, physicists lost interest in tachyons and went on to more massive hypotheses, such as black holes. As far as physicists are concerned, tachyons do not exist. (Committee for Skeptical Inquiry) There is a Canadian company called D-Wave which has the first commercially available quantum computer, and is set to release a 512-qubit version by the end of this year. Add artificial intelligence to this computing capability, and the possibilities are mind-boggling. Traditional computers process information as bits that can be a 0 or a 1. Quantum computers utilize the potential of quantum mechanics by making their bits a 0, a 1, or a 0 and a 1 simultaneously. This “superposition” lets them do many calculations at once, where a traditional computer can perform only one. It would appear the only restrictions on this technology are the limitations imposed by the speed of light, but future experiments may produce instantaneous transfers. The experiment was carried out by scientists at Hefei National Laboratory for Physical Sciences in Anhui, China. During its course, the scientists took two quantum entangled particles. One was sent to a distant quantum memory node while the other was present at the lab. The scientists then altered the state of the photon in the lab and it directly affected the state of the distant photon. This is a very exciting development for the world of quantum computing as well as for those researching faster modes of communication transmission. If indeed this progress can be translated into greater, more sophisticated systems, it would mean that we can create the fastest data-transmission machines in the near future. Will it be possible in the future to be placed inside one of these machines, have your whole body mapped into digital form, and be transmitted by quantum tunneling to another location in space and time? An MRI system can create axial images as well as sagittal (slicing the bread side-to-side lengthwise) and coronal (think of the layers in a layer cake) images, or any degree in between, without the patient ever moving. While it is true that a tachyon is a putative particle, at one point in time the Higgs boson was also just a theoretical particle, until its recent discovery by the LHC at CERN.
The detection of a tachyon would by its very nature be very difficult to prove, since such a particle resides just above the speed of light. Absolute, true and mathematical time, of itself, and from its own nature, flows equably without relation to anything external, and by another name is called duration: relative, apparent and common time, is some sensible and external (whether accurate or unequable) measure of duration by the means of motion, which is commonly used instead of true time; such as an hour, a day, a month, a year. (Isaac Newton, cited in Philosophy of Physics: Space and Time by Tim Maudlin, pg 13.) Space and time are the framework within which the mind is constrained to construct its experience of reality. (Immanuel Kant) He seems to be describing a "Star Trek" transporter, not a time machine. However, no such radiation was discovered by any test, so the uniform conclusion of physicists is that tachyons do not exist. We have invented measurement of space in order to be able to quantify distances, and we have invented measurement of time in order to be able to quantify durations. By this perspective, time is not really anything -- it is merely the intellectual imposition of order. This theory has a wide range of consequences which have been experimentally verified, including counter-intuitive ones such as length contraction, time dilation and relativity of simultaneity. 1. Once transmitted digitally, how is the body re-assembled into its original biological form? 2. How does the body return, if the transmitting unit is at the origin point? In truth, a teleporter, which I'm sure my opponent has no problem with, would still transmit across time, but in a tighter field, and only moments into the future. To measure a particle traveling faster than the speed of light requires equipment that hasn't been devised yet. So then perhaps Einstein was wrong with Special Relativity, and if time is a mere sequential measurement, then odd effects such as time dilation cannot exist. 'First round both debaters opened very strongly, both using very good scientific logic and the steady progress of mankind as strong bases for an opener, while both maintaining a somewhat tongue-in-cheek approach to Hollywood’s portrayal of time travel and how easy it seems. For the first round, due to a better explanation of the cons of the realities of time travel, the round goes to Adjensen. Second round, Druid42 attempted to explain how we as humans could go about achieving the possibility of time travel, but as it would seem didn’t have the room to extrapolate the theory properly, instead delivering a more ‘teleportation’ based theory than one of time travel. Adjensen however in the second round recovered somewhat strongly, expanding on his previous post regarding the impossibilities of time manipulation and dimension given any forms of technology. Even though the information in the second round was only marginally more than in the first, the second round goes to Adjensen for a once again more concise reply. Last round Druid42 began to address the issues his opponent had raised, and gave some very strong retorts to Adjensen’s ideas in the second round. He recovered strongly by bringing the facts to the table and opening the possibility of how time travel could actually work. However, Adjensen had one last card to play, and this statement: "The inevitable plot hole of any time travel story is that with such a device, anything in any time can be done. Screwed something up?
No worries, just go back five minutes earlier and inform yourself of the error. That didn't work, either? Go back five minutes before that. Repeat until you get it right, because you always have five more minutes." This is very convincing for the con stance of time travel. A very difficult debate to judge given the complex nature of the topic, but adjensen is the winner on this one.' Although both contestants brought up valid points, I feel that Druid42 has prevailed. While Druid42 has shown that the future is full of possibilities and such a machine/concept is a future possibility, adjensen maintained a present sense of technology, not looking toward the future of possibilities. Their arguments are based on what it is we know and understand today without giving leeway to what we may understand tomorrow. As we do not understand how such a machine would impact our past/future, we can only argue against it by the standards we now have the ability to grasp, and I don't feel that they made a good enough argument to nullify the possibility as a future occurrence. I won't profess to understand exactly what these two great debaters were talking about, but overall the case adjensen built seemed to be more coherent. I actually also learned something about the possibilities of time travel from adjensen, who was debating against time travel. This part especially sold me to adjensen's side: "Why is the timeframe of the receipt of a time machine of no relevance? Because once it exists, ever and anywhere, it exists always and everywhere. It doesn't matter if such a device isn't invented for a million billion years, because it takes away any limitations on where and when it can be". He did well in showing the logical inconsistencies inherent in Druid's side. What I liked best about Druid's debate is the idea of first proving that instant movement of matter through space looks to become a possibility in the future and then extrapolating from that the idea of movement through time. He bolstered this point with "Rose's Law", showing the exponential progress of mankind. This was a brilliant move that swayed me to his side of the debate for a short while. I would prefer Druid's side to be true (wouldn't we all?) but at the end of the debate I feel that adjensen made the slightly stronger case.
<urn:uuid:5e5b0534-0734-4a60-95b9-33a5f11a8a5c>
CC-MAIN-2014-49
http://www.abovetopsecret.com/forum/thread899903/pg
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931007510.17/warc/CC-MAIN-20141125155647-00058-ip-10-235-23-156.ec2.internal.warc.gz
en
0.963643
1,949
3.578125
4
Technologies that exploit the unique weirdness of quantum mechanics could debut in the very near future, thanks to the groundbreaking work of a huge European research consortium. Unbreakable cryptography, unimaginable simulations of profoundly complex problems and super-fast networks are just some of the promises held out by quantum computing. And now European scientists are poised to deliver on that promise, thanks to the work of the Qubit Applications (QAP) project. The integrated project has cherry-picked major obstacles in the path of quantum computing, problems that could have immediate applications and could command a ready market. Chief among them is quantum cryptography. “Quantum computing, when it arrives, could make all current cryptographic technology obsolete,” notes QAP co-coordinator Professor Ian Walmsley. Thankfully, researchers have developed quantum cryptography to deal with that issue. “Quantum cryptography over short distances was demonstrated in a previous project,” explains Walmsley. “The problem is, it only works over a short distance.” Weaving entangled webs That is because quantum cryptography relies on entanglement. Entanglement is a concept that explains how two or more particles exhibit correlation – a relationship if you like – that would be impossible to explain unless you supposed that they belonged to the same entity, even though they might be separated by vast distances. Imagine you were playing a game of quantum coin flipping with a colleague: you are heads and the colleague tails. You are two distinct individuals, but if the coin comes up heads your colleague loses, and you win. There is a correlation between the coin tossing. Now, with a quantum coin, the coin is simultaneously heads (the colleague loses) and tails (you lose) until it is observed, yet the outcomes seen by the two of you always remain perfectly correlated. This is the extra bit that quantum mechanics gives us, and which we use in secure communications, suggests Walmsley. That explains, with a little inaccuracy, the concept of entanglement, and it is at the core of quantum key distribution, or QKD. It is far too complex to break quantum encryption by brute force, and it is immune to eavesdropping because, at the quantum level, the act of observing an object changes the object observed. It means that encryption is guaranteed by the laws of physics. The technique was demonstrated in Vienna in 2008, but it works only over short distances. EU-funded QAP hopes to develop a quantum repeater that can maintain entanglement over large distances. It has already had considerable success at ranges up to 200 km, and growing. Ideal information carrier Maintaining entanglement over long distances – so essential to QKD, but also communications and networks – is the most immediate and compelling application in the QAP programme, but it is far from the only one. Many other areas of work show signs of progress, too. Storage and memory are essential for quantum computing. It is not too difficult to encode a piece of information on a photon, which is an ideal information carrier because of its high speed and weak interaction with the environment. It is, however, difficult to store that information for any length of time, so QAP is developing ways of transferring quantum information from photons to and from atoms and molecules for storage, and the project is making steady progress. Similarly, QAP’s work to develop quantum networks is progressing well. One team within the overall research effort has managed to develop a reliable way to calibrate and test detectors, a prime element in the network system.
“This is important because it will be essential to develop reliable methods to test results if work on quantum networks is to progress,” notes Walmsley. The research group has submitted a patent application for this work. Quantum simulation, too, offers some tantalising opportunities. The primary goal of QAP’s Quantum Simulations and Control subproject is to develop and advance experimental systems capable of simulating quantum systems whose properties are not approachable on classical computers. Imagine, for example, trying to model superconducting theory. It is hugely complex, and classical computers are quickly overwhelmed by the size of the problem. But quantum methods are inherently capable of dealing with far greater complexity, because of the nature of the qubit, or quantum bit. Classical, digital bits operate on the basis of on or off, yes or no. But quantum bits can be yes, no, or both. It takes classical computing from a 2D into a 3D information world. One could say that, while classical computers attack problems linearly, quantum computers attack problems exponentially. As a result, with just a few qubits, it is possible to do incredibly large computations, and that means that quantum simulation of complex problems could be a medium-term application. “We are not saying we will solve all the problems in the area of simulation, but we will make a good start,” warns Walmsley. That defines QAP nicely: a kick-start for quantum applications in Europe. The QAP project received funding from the ICT strand of the EU’s Sixth Framework Programme for research.
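The eavesdropping-detection idea at the heart of QKD can be illustrated with a toy simulation of a BB84-style protocol. This is a hedged, self-contained sketch for intuition only; it is not QAP's actual protocol, and all names and parameters here are invented for the example:

import secrets

def rand_bits(n):
    return [secrets.randbelow(2) for _ in range(n)]

def measure(bit, prep_basis, meas_basis):
    # Measuring in the preparation basis returns the encoded bit;
    # measuring in the wrong basis returns a random result.
    return bit if prep_basis == meas_basis else secrets.randbelow(2)

def bb84_error_rate(n=4000, eavesdropper=False):
    a_bits, a_bases = rand_bits(n), rand_bits(n)
    photons = list(zip(a_bits, a_bases))   # (bit, basis) of each photon in flight
    if eavesdropper:
        # Intercept-resend attack: Eve measures in random bases and
        # re-emits, unavoidably disturbing the mismatched photons.
        e_bases = rand_bits(n)
        photons = [(measure(b, pb, eb), eb)
                   for (b, pb), eb in zip(photons, e_bases)]
    b_bases = rand_bits(n)
    b_bits = [measure(b, pb, mb) for (b, pb), mb in zip(photons, b_bases)]
    # Sifting: keep only positions where Alice's and Bob's bases agree
    kept = [i for i in range(n) if a_bases[i] == b_bases[i]]
    errors = sum(a_bits[i] != b_bits[i] for i in kept)
    return errors / len(kept)

print("error rate without Eve:", bb84_error_rate())                   # ~0.00
print("error rate with Eve:   ", bb84_error_rate(eavesdropper=True))  # ~0.25

A measured error rate near 25 percent on the sifted key is the tell-tale signature that lets the legitimate parties detect that the channel has been observed, which is the sense in which the security is "guaranteed by the laws of physics."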
<urn:uuid:574b6834-72e8-46c4-a94e-d34f51592ab9>
CC-MAIN-2014-49
http://www.sciencedaily.com/releases/2009/06/090615152926.htm
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400380358.68/warc/CC-MAIN-20141119123300-00258-ip-10-235-23-156.ec2.internal.warc.gz
en
0.941921
1,051
3.5625
4
New evidence that plants get their energy using quantum entanglement 2014 01 20 By George Dvorsky | io9 Biophysicists theorize that plants tap into the eerie world of quantum entanglement during photosynthesis. But the evidence to date has been purely circumstantial. Now, scientists have discovered a feature of plants that cannot be explained by classical physics alone — but which quantum mechanics answers quite nicely. The fact that biological systems can exploit quantum effects is quite astounding. In a way, they’re like mini-quantum computers capable of scanning all possible options in order to choose the most efficient paths or solutions. For plants, this means the ability to make the most of the energy they receive and then deliver that energy from leaves with near perfect efficiency. But for this to work, plants require the capacity to work in harmony with the wild, wacky, and extremely small world of quantum phenomena. The going theory is that plants have light-gathering macromolecules in their cells that can transfer energy via molecular vibrations — vibrations that have no equivalents in classical physics. Most of these light-gathering macromolecules are composed of chromophores attached to proteins. These macromolecules carry out the first step of photosynthesis by capturing sunlight and efficiently transferring the energy. Previous inquiries suggested that this energy is transferred in a wave-like manner, but it was a process that could still be explained by classical physics. In Perfect Quantum Harmony In the new study, however, UCL researchers identified a specific feature in biological systems that can only be predicted by quantum physics. The team learned that the energy transfer in the light-harvesting macromolecules is facilitated by specific vibrational motions of the chromophores. "We found that the properties of some of the chromophore vibrations that assist energy transfer during photosynthesis can never be described with classical laws, and moreover, this non-classical behaviour enhances the efficiency of the energy transfer," noted supervisor and co-author Alexandra Olaya-Castro in a statement. The vibrations in question are periodic motions of the atoms within a molecule. It’s similar to how an object moves when it’s attached to a spring. Sometimes, the energy of two vibrating chromophores matches the energy difference between the electronic transitions of chromophores. The result is a coherent exchange of a single quantum of energy. Read the full article at: io9.com
<urn:uuid:c482c82a-f382-404a-a62f-80bae1222c32>
CC-MAIN-2014-49
http://www.redicecreations.com/article.php?id=28567
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931008520.8/warc/CC-MAIN-20141125155648-00039-ip-10-235-23-156.ec2.internal.warc.gz
en
0.913021
1,113
3.703125
4
A possible application is the development of a super-fast computer and highly precise clocks that could be the future basis for a new standard of time. Serge Haroche and David Wineland have opened the door to a new era of experimentation with quantum physics by demonstrating the direct observation of individual quantum systems without destroying them. Through their ingenious laboratory methods they have managed to measure and control very fragile quantum states, enabling their field of research to take the very first steps towards building a new type of super-fast computer based on quantum physics. These methods have also led to the construction of extremely precise clocks that could become the future basis for a new standard of time, with more than a hundred-fold greater precision than present-day caesium clocks. For single particles of light or matter, the laws of classical physics cease to apply and quantum physics takes over. But single particles are not easily isolated from their surrounding environment and they lose their mysterious quantum properties as soon as they interact with the outside world. Both Laureates work in the field of quantum optics studying the fundamental interaction between light and matter. In David Wineland’s laboratory in Boulder, Colorado, electrically charged atoms or ions are kept inside a trap by surrounding them with electric fields. One of the secrets behind Wineland’s breakthrough is the mastery of the art of using laser beams and creating laser pulses. A laser is used to put the ion in its lowest energy state, thus enabling the study of quantum phenomena with the trapped ion. A carefully tuned laser pulse can be used to put the ion in a superposition state, which is a simultaneous existence of two distinctly different states. For instance, the quantum superposition of the ion’s energy states can be studied by using the laser pulse to nudge the ion halfway between the high- and low-energy levels. Controlling single photons Serge Haroche and his research group employ a different method to reveal the mysteries of the quantum world. In their laboratory in Paris microwave photons bounce back and forth inside a small cavity between two mirrors, about three centimetres apart. The mirrors are made of superconducting material and are cooled to a temperature just above absolute zero. These superconducting mirrors are so reflective that a single photon can bounce back and forth inside the cavity for almost a tenth of a second before it is lost or absorbed. During its long lifetime, many quantum manipulations can be performed with the trapped photon. Haroche uses specially prepared atoms, so-called Rydberg atoms, to both control and measure the microwave photon in the cavity. A Rydberg atom has a radius of about 125 nanometres, which is roughly 1,000 times larger than typical atoms. The Rydberg atoms are sent into the cavity one by one at a carefully chosen speed, so that the interaction with the microwave photon occurs in a well-controlled manner. The Rydberg atom traverses and exits the cavity, leaving the microwave photon behind. But the interaction between the photon and the atom creates a change in the phase of the atom's quantum state: if you think of the atom’s quantum state as a wave, the peaks and the dips of the wave become shifted. This phase shift can be measured when the atom exits the cavity, thereby revealing the presence or absence of a photon inside the cavity. With no photon there is no phase shift.
Haroche can thus measure a single photon without destroying it. Physics in the quantum world has some inherent uncertainty or randomness to it. One example of this contrary behaviour is superposition, where a quantum particle can be in several different states simultaneously. Why do we never become aware of these strange facets of our world? Why can we not observe a superposition of a quantum marble in our every-day life? The Austrian physicist and Nobel Laureate (Physics 1933) Erwin Schrödinger battled with this question. Like many other pioneers of quantum theory, he struggled to understand and interpret its implications. As late as 1952, he wrote: “We never experiment with just one electron or atom or (small) molecule. In thought-experiments we sometimes assume that we do; this invariably entails ridiculous consequences...” In order to illustrate the absurd consequences of moving between the micro-world of quantum physics and our every-day macro-world, Erwin Schrödinger described a thought experiment with a cat: Schrödinger’s cat is completely isolated from the outside world inside a box. The box also contains a bottle of deadly cyanide which is released only after the decay of some radioactive atom, also inside the box. The radioactive decay is governed by the laws of quantum mechanics, according to which the radioactive material is in a superposition state of both having decayed and not yet decayed. Therefore the cat must also be in a superposition state of being both dead and alive. Now, if you peek inside the box, you risk killing the cat because the quantum superposition is so sensitive to interaction with the environment that the slightest attempt to observe the cat would immediately ‘collapse’ the ‘cat-state’ to one of the two possible outcomes — dead or alive. Instead of Schrödinger’s cat, Haroche and Wineland trap quantum particles and put them in cat-like superposition states. These quantum objects are not really as macroscopic as a cat, but they are still quite large by quantum standards. Inside Haroche’s cavity microwave photons are put in cat-like states with opposite phases at the same time, like a stopwatch with a needle that spins both clockwise and counterclockwise simultaneously. The microwave field inside the cavity is then probed with Rydberg atoms. The result is another baffling quantum effect called entanglement. Entanglement has also been described by Erwin Schrödinger and can occur between two or more quantum particles that have no direct contact but still can read and affect the properties of each other. Entanglement of the microwave field and Rydberg atoms allowed Haroche to map the life and death of the cat-like state inside his cavity, following it step by step, atom by atom, as it underwent a transition from the quantum superposition of states to a well-defined state of classical physics. A possible application of ion traps that many scientists dream of is the quantum computer. In present-day classical computers the smallest unit of information is a bit that takes the value of either 1 or 0. In a quantum computer, however, the basic unit of information — a quantum bit or qubit — can be 1 and 0 at the same time. Two quantum bits can simultaneously take on four values — 00, 01, 10 and 11 — and each additional qubit doubles the number of possible states. For n quantum bits there are 2^n possible states, and a quantum computer of only 300 qubits could hold 2^300 values simultaneously.
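That exponential growth is easy to verify. Here is a quick illustrative Python snippet (ours, not from the Nobel text) that prints the state-space size and the classical memory a full state vector would need:

# Each added qubit doubles the number of basis states: n qubits -> 2**n.
# One complex amplitude in double precision takes 16 bytes.
for n in (1, 2, 10, 30, 50):
    states = 2 ** n
    print(f"{n:>3} qubits: {states:,} states, "
          f"{states * 16 / 1e9:,.3f} GB for a full state vector")
print(f"300 qubits: about {2**300:.2e} states, beyond any classical memory")

Already at 50 qubits the state vector would occupy roughly 18 million gigabytes, which is why simulating even modest quantum systems overwhelms classical machines.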
Wineland’s group was the first in the world to demonstrate a quantum operation with two quantum bits. Since control operations have already been achieved with a few qubits, there is no reason to believe that it should not be possible to achieve such operations with many more qubits. However, to build such a quantum computer one has to satisfy two opposing requirements: the qubits need to be adequately isolated from their environment in order not to destroy their quantum properties, yet they must also be able to communicate with the outside world in order to pass on the results of their calculations. David Wineland and his team of researchers have also used ions in a trap to build a clock that is a hundred times more precise than the caesium-based atomic clocks which are currently the standard for our measurement of time. Time is kept by setting, or synchronizing, all clocks against one standard. Caesium clocks operate in the microwave range whereas Wineland’s ion clocks use visible light — hence their name: optical clocks. An optical clock can consist of just one ion or two ions in a trap. With two ions, one is used as the clock and the other is used to read the clock without destroying its state, or causing it to miss a tick. The precision of an optical clock is better than one part in 10^17 — if one had started to measure time at the beginning of the universe in the Big Bang about 14 billion years ago, the optical clock would only have been off by about five seconds today. With such precision, some extremely subtle and beautiful phenomena of nature have been observed, such as changes in the flow of time, or minute variations of gravity, the fabric of space-time. According to Einstein’s theory of relativity, time is affected by motion and gravity. The higher the speed and the stronger the gravity, the slower the passage of time. We may not be aware of these effects, but they have in fact become part of our everyday life. When we navigate with the GPS we rely on time signals from satellites with clocks that are routinely calibrated, because gravity is somewhat weaker at an altitude of several hundred kilometres. With an optical clock it is possible to measure a difference in the passage of time when the clock’s speed is changed by less than 10 metres per second, or when gravity is altered as a consequence of a difference in height of only 30 centimetres. [Edited excerpts from “Popular Information” available at the Nobel Prize website]
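Those last two figures can be sanity-checked with the standard first-order formulas: a gravitational fractional frequency shift of g·Δh/c² and a velocity shift of v²/2c². The short calculation below is ours (textbook relativity, not from the Nobel text) and reproduces the orders of magnitude:

# First-order fractional clock-rate shifts from relativity
c = 2.998e8   # speed of light, m/s
g = 9.81      # Earth surface gravity, m/s^2

def gravitational_shift(delta_h):
    """Fractional rate change for a height difference delta_h (metres)."""
    return g * delta_h / c**2

def velocity_shift(v):
    """Fractional rate change for speed v (m/s), valid for v << c."""
    return v**2 / (2 * c**2)

print(f"30 cm height difference: {gravitational_shift(0.30):.2e}")  # ~3e-17
print(f"10 m/s speed:            {velocity_shift(10):.2e}")         # ~6e-16

Both shifts sit at or above the quoted optical-clock precision of about one part in 10^17, which is exactly why such clocks can resolve a 30-centimetre change in height.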
<urn:uuid:f4eb257e-d525-4803-ab43-86d83d97953a>
CC-MAIN-2014-49
http://www.thehindu.com/sci-tech/science/methods-to-measure-manipulate-quantum-systems/article3985102.ece?ref=relatedNews
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400372743.62/warc/CC-MAIN-20141119123252-00135-ip-10-235-23-156.ec2.internal.warc.gz
en
0.944165
1,940
3.546875
4
SSL, or Secure Sockets Layer, was first developed by Netscape in the mid-1990s to address the growing need to be able to securely transmit data. It protects data, verifies the legitimacy of a website, and is supported by all major browsers. When you log into a banking website, your computer is sent a file called an "SSL certificate" which contains data such as the domain the certificate was issued for, the identity of the certificate authority that issued it, its validity period, and the server's public key. Based on the certificate's info, your browser decides whether or not to trust the certificate. This is possible because it uses third-party data, already in your browser, to confirm the certificate wasn't sent by a hacker. Once the certificate is received, the browser checks that the certificate was issued by a trusted third party known as a certificate authority. The browser then uses the public key to encrypt a random, symmetric encryption key and sends it to the server. The web server then decrypts the symmetric encryption key using its private key and uses the symmetric key to decrypt the URL and the HTTP data. Finally, the browser decrypts a response from the server using the symmetric key and displays the information. Due to the nature of the Internet, the path the content follows between a server and a web browser is not secure. There is always the possibility someone is using a "packet sniffer" to capture data as it passes through a network or, if you're wireless, right out of the air. This is where encryption comes in. Originally, SSL used 40-bit encryption, meaning the value of the key used to decrypt data was selected from 1 out of 1,099,511,627,776 possible values. Today, that level of encryption can be broken almost instantly; so, 128-bit encryption is commonly used, which means 340,282,366,920,938,463,463,374,607,431,768,211,456 possible values; increase it to 256 bits for more security and you approach the theoretical number of atoms in the universe. Even with millions of today's top-of-the-line computers working together, brute-force decryption simply takes too long if data is encrypted properly. That said, it's always best to be paranoid because future technologies like quantum computing may render conventional encryption obsolete. If a brute-force attack won't work, how else can SSL be compromised? No matter how air-tight a security system is, all that work is pointless if users trusted with access have weak passwords or can be tricked into providing their passwords. Although not SSL-specific, it's vital that best practices are used to prevent non-technical, "social engineering" attacks. There is also the possibility that browser and/or server flaws could be exploited. A good way to minimize the risk of a hacker taking advantage of exploits is to subscribe to Twitter feeds or blogs related to web security. This way, vulnerabilities can be fixed shortly after they're made public. Another approach would be to establish a list of supported browsers so that you can block or redirect users whose browsers aren't secure. Flaws in SSL itself could potentially be identified and exploited. SSL supports multiple types of encryption and, in 2008, researchers were able to spoof a certificate by exploiting the weak MD5 hash function. This was done with an array of 200 PlayStation 3s, and it was made possible because some certificate authorities relied on MD5 alone. So, the reliability of an SSL certificate is directly related to the reliability of its certificate authority. If a certificate authority issues an SSL certificate to a hacker's site, users could be fooled into thinking they are on a legitimate site due to successful SSL authentication.
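To see these pieces in practice, here is a short illustrative snippet using Python's standard ssl module to pull a server's certificate and negotiated cipher; "example.com" is just a placeholder hostname, and the exponent checks at the end reproduce the key-space figures quoted above:

import socket
import ssl

hostname = "example.com"                 # placeholder; any HTTPS site works
context = ssl.create_default_context()   # trusts the system's CA store

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        cert = tls.getpeercert()              # parsed certificate fields
        print("issuer:     ", cert["issuer"])
        print("valid until:", cert["notAfter"])
        print("cipher:     ", tls.cipher())   # (name, protocol, key bits)

# The brute-force key-space sizes quoted above
print(2 ** 40)    # 1,099,511,627,776
print(2 ** 128)   # 340,282,366,920,938,463,463,374,607,431,768,211,456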
Furthermore, some authorities use better encryption methods than others. You can get a certificate from GoDaddy for $70/year or you can spend at least $695 at Symantec. Guess which business takes security more seriously! First, there's a yearly cost associated with SSL which must be weighed against the security benefit. Is there any data on the site that hackers might use, or is there any motivation for your site to be hacked more than another site? If you're doing financial transactions then you pretty much have to use SSL or users will not feel secure, not to mention it would be an obvious target for hackers. That said, if your site only contains openly shared data and is backed up regularly, the biggest risks might be that an admin's password could be captured or that users might use the same password on other sites that do contain sensitive data. SSL also uses additional server resources encrypting and decrypting content. Although the difference is minor due to the processing power of today's servers, it can be noticeable on high-traffic sites. If you want to mix secure and non-secure content on the same page then users may get browser warnings, so this limits the ability to host some content elsewhere; for example, on a content distribution network. Finally, extra time is needed to purchase the certificate, set up the server, configure the website, and test. Sometimes SSL is a given, but it can be more of a qualitative question based on the balance between practicality and ideology. Yes, any unencrypted login is vulnerable to attack, but what are the chances? The best thing to do is weigh the overall cost of SSL against how sensitive your content is and what might happen, worst case, if it is compromised. If you're not sure whether or not to use SSL but you have the money and don't see any major technical obstacles then go ahead and use it. A less expensive alternative might be to integrate a service like PayPal that handles authentication outside your website. On the other hand, if SSL's authentication and encryption aren't enough, consider using physical tokens. A physical token is a device that assists with authentication. For example, the device may periodically display a different value used to log in based on the current time. This approach removes the reliance on the certificate authority and allows more control over who has access. It can even be used to establish a VPN connection to the server before the website can be accessed. When configuring Drupal to use SSL, a good place to start is the Secure Pages module, which lets you define which pages are secure and handles redirects from or to secure pages as needed. If you're using Secure Pages with Drupal 6 then the Secure Pages Prevent Hijack module should be installed to prevent hijacked sessions from accessing SSL pages. Also, the Auth SSL Redirect module can be used to redirect authenticated users to SSL and it will work in conjunction with Secure Pages. If you're using Ubercart and want to either secure the whole site or just Ubercart pages then another option is Ubercart SSL and it can be extended to secure additional pages. In general, these modules help manage transitions between secure and insecure pages. [Updated based on comment feedback.] What do you think, what approaches do you recommend, and what do you recommend against?
<urn:uuid:00840c3b-5f87-4495-a089-db1f0f1fc21d>
CC-MAIN-2014-49
http://www.mediacurrent.com/blog/secure-authentication-and-drupal
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400379636.59/warc/CC-MAIN-20141119123259-00095-ip-10-235-23-156.ec2.internal.warc.gz
en
0.930926
1,392
4
4
Quantum computers should be much easier to build than previously thought, because they can still work with a large number of faulty or even missing components, according to a study published today in Physical Review Letters. This surprising discovery brings scientists one step closer to designing and building real-life quantum computing systems – devices that could have enormous potential across a wide range of fields, from drug design and electronics to code-breaking. Scientists have long been fascinated with building computers that work at a quantum level – so small that the parts are made of just single atoms or electrons. Instead of 'bits', the building blocks normally used to store electronic information, quantum systems use quantum bits or 'qubits', made up of an arrangement of entangled atoms. Materials behave very differently at this tiny scale compared to what we are used to in our everyday lives – quantum particles, for example, can exist in two places at the same time. "Quantum computers can exploit this weirdness to perform powerful calculations, and in theory, they could be designed to break public key encryption or simulate complex systems much faster than conventional computers," said Dr Sean Barrett, the lead author of the study, who is a Royal Society University Research Fellow in the Department of Physics at Imperial College London. The machines have been notoriously hard to build, however, and were thought to be extremely sensitive to errors. In spite of considerable buzz in the field in the last 20 years, useful quantum computers remain elusive. Barrett and his colleague Dr Thomas Stace, from the University of Queensland in Brisbane, Australia, have now found a way to correct for a particular sort of error, in which the qubits are lost from the computer altogether. They used a system of 'error-correcting' code, which involved looking at the context provided by the remaining qubits to decipher the missing information correctly. "Just as you can often tell what a word says when there are a few missing letters, or you can get the gist of a conversation on a badly-connected phone line, we used this idea in our design for a quantum computer," said Dr Barrett. They discovered that the computers have a much higher threshold for error than previously thought – up to a quarter of the qubits can be lost – but the computer can still be made to work. "It's surprising, because you wouldn't expect that if you lost a quarter of the beads from an abacus that it would still be useful," he added. The findings indicate that quantum computers may be much easier to build than previously thought, but as the results are still based on theoretical calculations, the next step is to actually demonstrate these ideas in the lab. Scientists will need to devise a way of scaling the computers to a sufficiently large number of qubits to be viable, says Barrett. At the moment the biggest quantum computers scientists have built are limited to just two or three qubits. "We are still some way off from knowing what the true potential of a quantum computer might be," says Barrett. "At the moment quantum computers are good at particular tasks, but we have no idea what these systems could be used for in the future," he said. "They may not necessarily be better for everything, but we just don't know. They may be better for very specific things that we find impossible now." For further information please contact: Research Media Relations Manager, Imperial College London. Telephone: +44 (0)207 594 8432 or ext.
48432 Out of hours duty Press Officer: +44 (0)7803 886 248 Notes to editors: 1. All are welcome to attend the lecture by Professor Alain Aspect of CNRS at Imperial College London from 17.30 – 18.30 on Thursday 11 November, "From Einstein's intuition to quantum bits: a new quantum age?" The lecture will be held in the Great Hall in the Sherfield Building on Imperial College London's South Kensington campus. Please email firstname.lastname@example.org for further information or to register to attend. 2. "Fault tolerant quantum computation with very high threshold for loss errors", Physical Review Letters, 09 November 2010, to be published online at 1500 London time (GMT) / 1000 US Eastern time Tuesday 9th November (no embargo). Link to paper on pre-print server: http://arxiv.org/abs/1005.2456 Corresponding author: Sean Barrett, Institute for Mathematical Sciences, Imperial College London. 3. Contact for Australian media: Dr Thomas Stace, Co-author (University of Queensland, Brisbane, Australia) Tel: +61 40 441 3069 4. Images are available for the media at: Credit: Sean Barrett and Thomas Stace. Caption: Illustration of the error correcting code used to demonstrate robustness to loss errors. Each dot represents a single qubit. The qubits are arranged on a lattice in such a way that the encoded information is robust to losing up to 25 percent of the qubits. 5. The Royal Society is an independent academy promoting the natural and applied sciences. Founded in 1660, the Society has three roles, as the UK academy of science, as a learned Society, and as a funding agency. It responds to individual demand with selection by merit, not by field. As we celebrate our 350th anniversary in 2010, we are working to achieve five strategic priorities. 6. About Imperial College London: Consistently rated amongst the world's best universities, Imperial College London is a science-based institution with a reputation for excellence in teaching and research that attracts 14,000 students and 6,000 staff of the highest international quality. Innovative research at the College explores the interface between science, medicine, engineering and business, delivering practical solutions that improve quality of life and the environment - underpinned by a dynamic enterprise culture. Since its foundation in 1907, Imperial's contributions to society have included the discovery of penicillin, the development of holography and the foundations of fibre optics. This commitment to the application of research for the benefit of all continues today, with current focuses including interdisciplinary collaborations to improve global health, tackle climate change, develop sustainable sources of energy and address security challenges. In 2007, Imperial College London and Imperial College Healthcare NHS Trust formed the UK's first Academic Health Science Centre. This unique partnership aims to improve the quality of life of patients and populations by taking new discoveries and translating them into new therapies as quickly as possible. Website: www.imperial.ac.uk
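The "missing letters" analogy in the release can be made concrete with a classical toy model. The sketch below is our illustration only; it is a simple repetition code, not the topological code analysed by Barrett and Stace, but it shows how redundancy lets information survive the loss of a quarter of its carriers:

import random

# Toy *classical* erasure code: each bit is stored as r copies, and a bit
# survives as long as at least one copy does. This is only an analogy for
# loss tolerance, not the lattice code in the paper.

def encode(bits, r):
    return [b for b in bits for _ in range(r)]

def erase(shares, loss_rate, rng):
    # Model lost qubits as erased shares (None)
    return [None if rng.random() < loss_rate else s for s in shares]

def recovered_fraction(n=10_000, r=3, loss_rate=0.25, seed=1):
    rng = random.Random(seed)
    bits = [rng.randrange(2) for _ in range(n)]
    shares = erase(encode(bits, r), loss_rate, rng)
    ok = 0
    for i, b in enumerate(bits):
        survivors = [s for s in shares[i * r:(i + 1) * r] if s is not None]
        ok += bool(survivors) and survivors[0] == b
    return ok / n

for r in (1, 3, 7):
    print(f"{r} copies, 25% loss -> {recovered_fraction(r=r):.4f} of bits recovered")

With a single copy, a quarter of the bits are simply gone; with three copies nearly all survive, and with seven the loss is negligible. Quantum codes must achieve this without cloning states, which is what makes the 25 percent loss threshold reported above remarkable.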
<urn:uuid:10e53536-cc0f-4183-8b2f-941c264bc96e>
CC-MAIN-2014-49
http://www.eurekalert.org/pub_releases/2010-11/icl-qca110910.php
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931012025.85/warc/CC-MAIN-20141125155652-00239-ip-10-235-23-156.ec2.internal.warc.gz
en
0.927361
1,386
3.921875
4
Scientists Split Atom, Then Put It Back Together "Now that we have gained control of single neutral atoms trapped in laser fields, we would like to use atoms to perform a novel kind of information processing -- namely, the so-called quantum information processing," explained research team leader Andrea Alberti. "In essence, our atoms behave as a quantum bit, a qubit." 06/15/12 5:00 AM PT Mention the words "splitting the atom," and most people will automatically think of nuclear fission, bombs and radioactivity. Recently, however, physicists at Germany's University of Bonn not only managed to "split" an atom in a different way -- using quantum mechanics -- but also put it back together again. "The fact that atoms, photons and molecules can be split at different locations is something already known," Andrea Alberti, team lead for the Bonn experiment and Alexander von Humboldt fellow at the Institut für Angewandte Physik, told TechNewsWorld. "What is really exciting is the level of quantum control and precision to which we pushed our system." The results of the experiment -- which has potential ramifications for quantum computing and beyond -- were published recently in the journal Proceedings of the National Academy of Sciences. Two Places at Once As part of this new experiment, which amounts to what's known as an "atom interferometer," scientists managed to keep a single atom in two places at once, separated by more than 10 micrometers, or one hundredth of a millimeter. Then, they were able to put it back together undamaged. "We are capable of trapping a single atom in a tiny box -- a box which is 0.020 micrometers in size and created by laser fields -- and subsequently split the atom into two boxes to reach separations up to 10 micrometers," Alberti explained. For an atom, 10 micrometers is an enormous distance. To put it in perspective, if the box were a glass of about 5 centimeters in diameter, say, then the atom's two parts would have been separated in two glasses 25 meters apart, he pointed out. The split was not directly visible, however. If you tried to take a picture, the atom would be seen in several images -- sometimes on the left, sometimes on the right, but never in both places. Nevertheless, it can be proven by putting the atom back together, the scientists noted. In addition, differences between the magnetic fields of the two positions or accelerations of the atom are discernible, since they become imprinted in the atom's quantum mechanical state. 'A Split Personality' Such quantum effects can only take place at the lowest temperatures and with careful handling. Specifically, the scientists involved used lasers to cool a cesium atom to a temperature of a tenth of a millionth of a degree above absolute zero and then held it using another laser. Next, they took advantage of the fact that atoms have a spin that can go in two directions simultaneously. Essentially, if the atom is moved by the second laser to the right and the left at the same time, it will split. "The atom has kind of a split personality: half of it is to the right, and half to the left, and yet, it is still whole," explained Andreas Steffen, lead author on the publication describing the experiment.
"If you think of an atom as being like a very small, very hard, very tough version of a marble or a ball bearing, then your thinking is trapped in a pre-1925 misconception," Daniel Styer, Schiffer Professor of physics at Oberlin College, told TechNewsWorld. "An atom can behave more like a cloud than a marble, although it doesn't behave exactly like either." Many people are familiar with the famous Schrödinger's cat thought experiment, in which a hypothetical cat exists both "alive" and "dead" at the same time. That experiment illustrates the difficulty of applying quantum mechanics to everyday objects. "In quantum mechanics, an atom doesn't have to have a position," Styer explained. "So if there are two routes to go from A to B, it is entirely possible for the atom to take both." What's known as the classic "double slit experiment" in physics gets at much the same notion. "Imagine a wall containing two small slits that are separated by a short distance," explained Jeanie Lau, an associate professor in the department of physics at the University of California at Riverside. "A particle in our everyday experience can go through only one of the slits, or bounce back," Lau told TechNewsWorld. A wave hitting the wall, however, will go through both slits, she pointed out. "In quantum mechanics, if the particle is small enough -- i.e., as small as an atom -- it can go through both slits and form interference patterns on the other side, just like a wave," Lau added. "Atoms, like waves, can interfere with each other, due to the particle-wave duality, a fundamental property of matter and a consequence of quantum mechanics." So, the Bonn experiment doesn't so much "split" the atom as it "uses the quantum mechanical nature of the particle -- that it can also behave like a wave -- to create interference by directing it to go through both slits," she explained. 'A Big Step Forward' It should be noted that atom interferometry -- or the process of "splitting" atoms and reassembling them -- "has been an active field of research since the 1930s, when it was first demonstrated," Andrew Cleland, professor of physics at the University of California at Santa Barbara, told TechNewsWorld. Indeed, "the ability to split a system into separate states and then bring them back together has long been one of the key aspects of quantum mechanics, and it has been shown experimentally in many circumstances," agreed Gary Felder, associate professor of physics at Smith College. "However, larger objects are harder to split and recombine in this way than smaller ones, and larger distances are harder than short ones," Felder told TechNewsWorld. "To split and recombine something as large as an atom over distances as great as tens of micrometers is a big step forward." Indeed, "only now, with this work from Bonn, have we had precise control over a single atom starting at one place with a position, then spreading out so as not to have a position, and finally ending with a single position again," Styer said. So where is all this leading? The Bonn scientists hope eventually it could help simulate complex quantum systems. Plant photosynthesis, for example, is a phenomenon that's hard to capture with modern supercomputers, but small quantum systems based on technology like this could be just what's needed. Then, too, there are the possibilities for quantum computing. 
"Now that we have gained control of single neutral atoms trapped in laser fields, we would like to use atoms to perform a novel kind of information processing -- namely, the so-called quantum information processing," Alberti explained. "In essence, our atoms behave as a quantum bit, a qubit," he noted. "Each atom can encode information in its spin state, up and down, but all possible superpositions of these two states are possible, exactly as we could split the atom at far apart locations." Computational speeds could be increased enormously as a result, Alberti added. An Exciting Era In some ways, however, the experiment's practical applications are almost less important, Styer opined. "Perhaps it can be used for precision measurements, perhaps it can be used to help build a quantal computer, or perhaps it will prove useful for nothing," he concluded. "But regardless of potential applications, it is great to be alive during an era when our understanding and control of nature is becoming so subtle and nuanced."
<urn:uuid:2930bf7e-2d21-4da5-84ba-37ed5eaf4f8c>
CC-MAIN-2014-49
http://www.technewsworld.com/story/Scientists-Split-Atom-Then-Put-It-Back-Together-75370.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400376197.4/warc/CC-MAIN-20141119123256-00068-ip-10-235-23-156.ec2.internal.warc.gz
en
0.959954
1,671
3.71875
4
Chips measure electron spin

Technology Research News

Practical quantum computers are at least a decade away, and some researchers are betting that they will never be built. This is because controlling individual particles like atoms, electrons and photons is extraordinarily challenging. Information carried in particles always comes in shades of gray and can be corrupted or wiped out by the slightest wisp of energy from the environment.

A pair of experiments has brightened prospects for quantum computing, however, by making it more likely that a practical means of reading electron-based quantum bits, or qubits, can be developed. Research teams from the University of California at Los Angeles and from Delft University of Technology in the Netherlands have developed electronic methods of detecting the spins of individual electrons.

Spin is a property of electrons that is akin to the rotation of a top. The two spin directions, spin up and spin down, are magnetically opposite, like the two poles of a kitchen magnet. The spins can represent the 1s and 0s of digital information.

Particles that are isolated from their environment are in the weird quantum state of superposition, meaning they are in some mix of the two spin directions. This means a qubit can be in some mix of 1 and 0, which allows a string of qubits to represent every binary number at once. This gives a quantum computer the ability to check every possible answer to a problem with a single set of operations, promising speedy solutions to problems that classical computers have to churn through one answer at a time. These include factoring large numbers, a problem whose difficulty is the foundation of most of today's security codes.

Electronic equipment has become sensitive enough that it is no longer difficult to detect the presence of a single electron. But detecting an electron's spin orientation is another matter. In recent years, researchers have succeeded in detecting electron spin optically using specialized laser setups. The key to using electron spin in quantum computers whose architecture is similar to today's computer chips is being able to detect the spin orientation electronically.

The UCLA team's method of electron spin detection uses devices that are already mass-produced. The researchers flipped a single electron spin in a commercial transistor chip, and detected the spin flip by measuring changes in current flowing through the device. Several proposed quantum computer architectures call for circuits that can be manufactured using today's chipmaking techniques.

"The transistor structure used for our experiment [closely] resembles some proposed spin-based qubit architectures," said Hong-Wen Jiang, a professor of physics at the University of California at Los Angeles. "We believe that our read-out scheme can be readily adapted in a scalable quantum information processor," he said.

Electrons travel through a transistor via a semiconductor channel that is electrically insulated. The transistor is controlled by a gate electrode, which produces an electric field that penetrates the insulator and increases the conductivity of the channel, allowing electrons to flow. Occasionally defects occur, producing one or more spots in the insulator that can draw individual electrons from the channel and trap them.

The researchers sought out transistors that contained single defect traps, set the gate voltage so that the trap had an equal chance of attracting an electron or not, and applied a large magnetic field to the trap.
A high magnetic field causes electrons in the spin-down state to have slightly more energy than spin-up electrons. The researchers flipped the electron's spin with a microwave pulse. An electron that is spin-up fills the trap, but a higher-energy spin-down electron leaves room, electrically speaking, for a second, spin-up electron from the channel to join it in the trap. The difference between having one and having two electrons in the trap is measurable as a change in the current flowing through the transistor. Two electrons decrease the amount of current. The researchers can observe a microwave pulse flipping the spin of an electron in the trap by measuring the current.

In its present form, the UCLA device uses a randomly-positioned defect as its electron trap, and electrons cycle through the trap rapidly enough that the spin measurement is an average of a few thousand electrons. The researchers are conducting similar experiments in specially designed semiconductor structures that promise greater control over electron spin, the ability to entangle two spins, and to eventually build a scalable quantum processor, said Jiang.

Properties of entangled particles, including spin, remain in lockstep regardless of the distance between them. Entanglement is a basic requirement of quantum algorithms, and entangled electrons would enable information to be teleported between circuits within a quantum computer.

Meanwhile, the Delft team devised a way to measure the spin of an electron trapped in a quantum dot -- a tiny semiconductor device that produces electric fields capable of confining one or a few electrons. "The technique works fully electrically, and is therefore... suitable for integration with existing solid-state technologies," said Jeroen Elzerman, a researcher at Delft University of Technology.

The researchers applied a large magnetic field to the trapped electron, which caused the spin-down state to have slightly more energy than the spin-up state. They tuned the quantum dot's electric field so that the energy of a spin-down electron was just high enough for it to escape, but the energy of a spin-up electron was below the threshold. Therefore, if an electron is present it is spin-up, and if the quantum dot is empty, the electron that escaped was spin-down.

The researchers' next step is to use pulsed microwaves to control the exact quantum superposition of the spin, said Elzerman. They then plan to entangle two spins. "When this is done, all the basic ingredients for a quantum computer are in place," he said.

Coupling many spins and controlling their interactions accurately enough to perform a quantum algorithm is a matter of improving control over the fabrication process, said Elzerman. "We need cleaner and purer materials and more reproducible electron beam lithography so that all dots on a single chip are really identical," he said.

Jiang's research colleagues were Ming Xiao and Eli Yablonovitch of UCLA, and Ivar Martin of Los Alamos National Laboratory. They published the research in the July 22, 2004 issue of Nature. The research was funded by the Defense Advanced Research Projects Agency (DARPA) and the Defense Microelectronics Activity (DMEA).

Elzerman's research colleagues were Ronald Hanson, Laurens Willems van Beveren, Benoit Witkamp, Lieven Vandersypen and Leo Kouwenhoven. They published the research in the July 22, 2004 issue of Nature.
The research was funded by DARPA, the Office of Naval Research, the European Union and the Dutch Organization for Fundamental Research on Matter (FOM).

Timeline: 10 years; 10-20 years
TRN Categories: Physics; Quantum Computing and Communications
Story Type: News
Related Elements: Technical papers, "Electrical detection of the spin resonance of a single electron in a silicon field-effect transistor," Nature, July 22, 2004; "Single-shot read-out of an individual electron spin in a quantum dot," Nature, July 22, 2004
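Both read-out schemes boil down to spin-to-charge conversion: a spin state is mapped onto something electrically measurable. The sketch below is a simplified model of that logic; the current levels and the two-electron rule are illustrative assumptions, not measured values from either experiment:

```python
def trap_current(trap_spin):
    """UCLA-style readout (toy model): a higher-energy spin-down electron
    leaves room for a second channel electron in the trap, and two trapped
    electrons reduce the transistor current (arbitrary units)."""
    electrons_in_trap = 2 if trap_spin == "down" else 1
    return 0.6 if electrons_in_trap == 2 else 1.0

before = trap_current("up")
after = trap_current("down")          # resonant microwave pulse flipped the spin
print(f"current: {before} -> {after} (the drop reveals the spin flip)")

def quantum_dot_readout(spin):
    """Delft-style single-shot readout: the dot is tuned so only the
    higher-energy spin-down electron can tunnel out, so an occupied dot
    means spin-up and an empty dot means spin-down."""
    dot_occupied = (spin == "up")
    return "up" if dot_occupied else "down"

print(all(quantum_dot_readout(s) == s for s in ("up", "down")))  # True
```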
<urn:uuid:5015ed3a-39de-4506-91d3-92bbfdf293f7>
CC-MAIN-2014-49
http://www.trnmag.com/Stories/2004/081104/Chips_measure_electron_spin_081104.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931009292.37/warc/CC-MAIN-20141125155649-00041-ip-10-235-23-156.ec2.internal.warc.gz
en
0.922861
1,576
3.78125
4
Quantum Computer Passes Math Test, But Doesn't Answer the Big Question

Is the world's first commercial quantum computer the real deal or not? No one is quite sure. The most recent experiment adding fodder to this debate used the quantum computer made by the Canadian company D-Wave Systems to determine hard-to-calculate solutions in a mathematical field known as Ramsey theory. Despite the machine's success, many scientists are still skeptical of this quantum computer's legitimacy.

"At the moment, it's not clear to my eyes that D-Wave device is what we would call a quantum computer," said computer scientist Wim van Dam from the University of California, Santa Barbara, who was not involved in the recent work.

Quantum computers harness the weird quirks of the subatomic world to run algorithms at extremely quick speeds and solve problems that stymie our current electronic devices. That's because classical computers rely on transistors that hold memory in the form of zeros and ones. A quantum computer, by contrast, uses subatomic particles (called qubits) that can be a one, a zero, or a simultaneous superposition of these two states.

Since the early 2000s, researchers have been able to build rudimentary quantum computers, but it wasn't until 2011 that D-Wave announced a commercial product with a 128-qubit processor. If it were truly a quantum computer, it would be leaps and bounds ahead of any other product, but the company's statements have been met with raised eyebrows from the computer science community. Still, D-Wave sold its first products to companies such as Lockheed Martin, while their second-generation device was bought up by Google and NASA.

The latest experiment used the D-Wave machine to find solutions to optimization problems in what is known as Ramsey theory, after British mathematician Frank Ramsey. This field deals with situations in which a certain kind of order appears within a disordered system. A well-known problem called the "party problem" asks what the minimum number of guests you would need to invite to a gathering to ensure that a small subset is made of people who all know each other and another who all don't. Solutions to this problem are given in what's known as Ramsey numbers. Calculating the minimum number of guests to ensure groups of three strangers and three friends is fairly easy (the answer is six). But increasing the number of people makes the solution increasingly hard to calculate, with most Ramsey numbers being beyond the capability of our current computers.

D-Wave's device was able to implement an algorithm to calculate Ramsey numbers for different configurations, though none that weren't already known from previous work. The findings appeared Sept. 25 in Physical Review Letters.

While noting that the D-Wave experiment's calculations were correct, the authors of a commentary piece in the same issue wrote that "many more tests would be needed to conclude that the logical elements are functioning as qubits and that the device is a real quantum computer."

Graeme Smith and John Smolin from IBM's Watson Research Center, the authors of the commentary, question just how coherent the qubits of D-Wave's computer are. Coherence refers to how long the particles are able to remain in a state of superposition (where they are both zero and one simultaneously), which is notoriously tricky to maintain.
Even small amounts of noise can cause the qubits' quantum mechanical wavefunction to collapse, turning them into classical objects that don't work like a true quantum computer. But the algorithms used to calculate these Ramsey numbers "don't need as much coherence as a full-blown quantum computer," said physicist Frank Gaitan of the University of Maryland, who worked on the D-Wave experiment.

Gaitan adds that D-Wave's machine is not necessarily a universal quantum computer, which could run any algorithm given to it. Instead, it is designed to be particularly good at solving optimization problems, such as those in Ramsey theory, and the evidence from his research shows that the device "uses some kind of quantum effect that solves some kind of problems."

Even then, there is still some question as to whether D-Wave's system is truly a quantum computer. Van Dam noted that Ramsey number problems aren't a good choice for proving anything about quantum computers. That's because "it's a really easy problem," he said.

He gave an analogy. Imagine a company says they built a self-driving car and then placed it on top of a hill. They start the car and it rolls to the bottom of the hill. You could say the car drove itself down or you could say it was carried downhill by gravity, and it might be hard to determine which one it is.

Gaitan hopes that future work will help clear up these problems. The current generation of D-Wave's system can't calculate any unknown Ramsey numbers. But their third-gen device, expected to come out in 2015, should have 2048 qubits, which might be enough to figure out new Ramsey numbers that are beyond the capability of current computers.
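The party problem mentioned above is small enough to check by brute force on a classical machine. The sketch below verifies R(3,3) = 6: some 2-coloring of the 10 edges of a 5-guest party avoids a monochromatic triangle, but none of the 32,768 colorings of a 6-guest party does. This exhaustive search is exactly the kind of computation that blows up for larger Ramsey numbers:

```python
from itertools import combinations, product

def has_mono_triangle(n, coloring):
    """coloring maps each guest pair to True (friends) or False (strangers)."""
    return any(
        coloring[(a, b)] == coloring[(a, c)] == coloring[(b, c)]
        for a, b, c in combinations(range(n), 3)
    )

def every_coloring_has_triangle(n):
    edges = list(combinations(range(n), 2))
    return all(
        has_mono_triangle(n, dict(zip(edges, bits)))
        for bits in product([True, False], repeat=len(edges))
    )

print(every_coloring_has_triangle(5))  # False: 5 guests are not enough
print(every_coloring_has_triangle(6))  # True: 6 guests always suffice, so R(3,3) = 6
```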
<urn:uuid:d06badd8-c466-4080-8dc5-9f0ba5dd9ddb>
CC-MAIN-2014-49
http://www.wired.com/2013/10/quantum-computer-ramsey/
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931009968.66/warc/CC-MAIN-20141125155649-00024-ip-10-235-23-156.ec2.internal.warc.gz
en
0.953976
1,095
3.640625
4
Simple optics make quantum relay

Technology Research News

If it weren't for repeaters, the light pulses that carry information over fiber-optic long distance lines would fade before they got much further than 100 kilometers. Quantum cryptography devices and networks, which transport photons whose properties can be used to represent the 1s and 0s of digital information, could also benefit from repeaters. Today's prototype quantum cryptography systems provide theoretically perfect security, but these systems can't carry information over long distances.

Researchers from the NASA-Caltech Jet Propulsion Laboratory have found a way to make a quantum repeater using ordinary optical equipment. Practical quantum repeaters could boost the reach of quantum cryptography systems, and eventually enable quantum networks. The device would allow for an exponential improvement in the distance quantum bits can be transmitted, said Jonathan Dowling, a principal scientist at the Jet Propulsion Laboratory.

The challenge was finding a way to preserve entanglement. Particle properties like polarization can become entangled when two or more particles come into contact with each other or simultaneously interact with a third entity like another particle or a laser beam. Entanglement keeps properties like polarization linked, regardless of the distance between entangled particles. A photon's electric field can be polarized, or oriented, in one of four directions. Pairs of directions can represent binary numbers.

Entanglement is the basic ingredient of many quantum computing, quantum cryptography and quantum communications schemes. Sharing entangled particles between locations makes theoretically perfectly secure communications possible because the traits of a series of particles can form a random string of bits that can be used to encrypt messages. It is impossible for an eavesdropper to copy or intercept the particles without disrupting the entanglement, which would reveal the security breach. Shared entanglement would also make it possible to network quantum computers.

"Many quantum communication protocols rely on shared entanglement between two distant parties," said Pieter Kok, one of the Jet Propulsion Laboratory researchers who is now at Hewlett-Packard Laboratories. But because photons must be in the same place when they are initially entangled, using entangled particles for communication means finding a way to transport them, he said. This is difficult because particles can't be copied without destroying their quantum information, which means ordinary repeaters, which produce copies of fading signals, can't be used for quantum communications.

The researchers' linear optical quantum repeater uses optical elements like mirrors, beam splitters and photodetectors to purify and transfer entanglement among photon pairs. Entanglement purification makes two or more partially entangled states into one fully entangled state. Entanglement swapping converts entanglement: entanglements between particles A and B and particles C and D can be converted to an entanglement between A and D.

Beam splitters direct photons in one of two directions based on the photons' polarization, and photodetectors at each output of a beam splitter determine a photon's polarization. The repeater is made up of a network of beam splitters and photodetectors that route photons based on whether specific photodetectors detect other photons.
The combination of the right paths and detection-triggered routing is enough to carry out entanglement purification and swapping. To use the system to initiate quantum communications, a sender, Alice, would entangle photons A and B, keep A, and send B to a receiver, Bob. A repeater in the network between Alice and Bob would generate a new pair of entangled photons, C and D, and bring together B and C. This would destroy B and C and in the process leave A entangled with D. The device would then send photon D on to Bob, giving Alice and Bob a shared pair of entangled photons. Rather than copying photons, the quantum repeater transfers entanglement.

In practice, there are degrees of entanglement, and in order to transmit entangled states of high enough purity, quantum communications schemes typically distill multiple entangled pairs down to a single pair of fully entangled photons. In the researchers' repeater, the purification step takes place before the entanglement swapping.

The linear optical quantum repeater was inspired by the landmark theoretical demonstration of linear optical quantum computing by Emanuel Knill, Raymond Laflamme and Gerard Milburn in 2001, said Dowling. "Since a repeater is just a very simple type of quantum computer, logic dictated it would be possible, but the devil was in the details," he said.

Other research teams have devised quantum repeaters that tap the interactions of photons with gas atoms. In these schemes, fading photons that enter a repeater transfer their quantum states to atoms, which can briefly store the state information until it can be transferred to fresh photons that are transmitted over the next leg of the network. Light-matter interactions are difficult to carry out, however, especially with equipment that could be used in practical communications networks. A third approach uses nonlinear optical quantum repeaters that use complicated equipment to cause photons to interact with each other; these may be harder to make than the linear design, said Dowling.

The researchers' goal is to develop simple devices that prove the utility of their linear optical approach, and eventually use the approach to build a full-scale quantum computer, said Dowling. A reliable source of entangled photons is a top priority, said Kok. "It not only has to be able to make high-quality entanglement, it also needs to do this reproducibly," he said. "Two sources must produce almost indistinguishable photon pairs in order for the interference to work."

Another key component that needs to be developed is quantum memory so that, for instance, Alice can hold onto her half of the original entangled photon pair. And the system eventually has to be miniaturized into a quantum optoelectronic chip, according to Dowling. Such systems could eventually be used in quantum cryptography systems, for quantum telecommunications, and for distributed quantum computing, said Dowling. It will be 20 years before the method can be used practically, he predicted.

Dowling and Kok's research colleague was Colin P. Williams. The work appeared in the August 1, 2003 issue of Physical Review A. The research was funded by the National Aeronautics and Space Administration (NASA), The Advanced Research and Development Activity (ARDA), the National Security Agency (NSA), the Office of Naval Research (ONR), and the Defense Advanced Research Projects Agency (DARPA).
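Entanglement swapping can be checked directly on state vectors. The sketch below is a simplified four-qubit simulation, not a model of the JPL optics: it prepares Bell pairs A-B and C-D, post-selects one Bell-measurement outcome on B and C, and shows that A and D end up maximally entangled even though they never interacted:

```python
import numpy as np

# Qubit order (A, B, C, D); basis index = 8a + 4b + 2c + d.
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)        # (|00> + |11>)/sqrt(2)

state = np.kron(phi_plus, phi_plus)                   # |Phi+>_AB (x) |Phi+>_CD

# Bell measurement on B and C, post-selected on the Phi+ outcome:
# projector P = I_A (x) |Phi+><Phi+|_BC (x) I_D
proj_bc = np.outer(phi_plus, phi_plus)
P = np.kron(np.eye(2), np.kron(proj_bc, np.eye(2)))

post = P @ state
post /= np.linalg.norm(post)                          # renormalize

# Surviving basis states: A and D are now perfectly correlated (a Bell
# pair), with B and C left in the measured Phi+ state.
for idx, amp in enumerate(post):
    if abs(amp) > 1e-9:
        print(format(idx, "04b"), round(float(amp), 3))
# prints 0000, 0110, 1001, 1111, each with amplitude 0.5
```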
Timeline: 20 years
TRN Categories: Quantum Computing and Communications; Physics; Cryptography and Security; Optical Computing, Optoelectronics and Photonics
Story Type: News
Related Elements: Technical paper, "Construction of a Quantum Repeater with Linear Optics," Physical Review A, August 1, 2003.
<urn:uuid:3dabec11-a52e-440f-ac07-406b787d61a6>
CC-MAIN-2014-49
http://www.trnmag.com/Stories/2004/022504/Simple_optics_make_quantum_relay_022504.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400380464.40/warc/CC-MAIN-20141119123300-00247-ip-10-235-23-156.ec2.internal.warc.gz
en
0.903418
1,474
4.09375
4
Tiny device is first complete 'quantum computer'

Aug 11, 2009

Researchers in the US claim to have demonstrated the first small-scale device to perform all the functions required in large-scale ion-based quantum processing. Although the individual stages or groups of stages in quantum computing have been demonstrated previously, this new device is said to perform a complete set of quantum logic operations without significant amounts of information being lost in transit. As a result, the device represents an important step in the quest for a practical quantum computer, say the researchers based at the US National Institute of Standards and Technology (NIST) in Boulder, Colorado.

Researchers in the field have already hailed this as an important breakthrough in quantum computing. However, they also warn of the practical challenges that still lie ahead if we are to develop large-scale quantum computers.

Where conventional computers store data as "bits" with value 1 or 0, in quantum computing data is stored as "qubits" which can hold more than one value at the same time. The upshot of this phenomenon, known as superposition, is that quantum computers could potentially store and process unprecedented amounts of data. What's more, quantum particles can become "entangled", allowing them to share a much closer relationship than classical mechanics allows, one in which data is transferred instantaneously between entangled particles regardless of their separation distance.

The quantum path

The concept of quantum computing gathered significant momentum in 1994 when the mathematician Peter Shor invented an algorithm to show that quantum computation could factor numbers significantly faster than in classical computation. The implication was that quantum computers could operate at ultra-high speeds, which could be applied to solving complex problems like cracking some of today's most widely used encryption codes.

However, it quickly became apparent that researchers would have a very difficult task of putting this into practice due to the delicate nature of quantum information, particularly when quantum data is being transferred between locations. Despite this limitation, some simple quantum algorithms have been executed in the past few years. Perhaps most notable was the first and only demonstration of Shor's factoring algorithm, using nuclear magnetic resonance, by Lieven Vandersypen and his colleagues at the IBM Almaden Research Center in California.

One promising approach to realizing quantum algorithms is the storage and transfer of quantum data in ultracold ions. This is the approach taken by the group at NIST, led by Jonathan Home, which, over the past few years, has demonstrated all of the steps needed for quantum computation: (1) "initialize" qubits to the desired starting state (0 or 1); (2) store qubit data in ions; (3) perform logic operations on one or two qubits; (4) transfer information between different locations in the processor; and (5) read out qubit results individually.

Caught in a trap

In this latest research, Home's group have now managed to combine all of these separate stages for the first time.
The team held two beryllium atoms in a trap before manipulating the energy states of each ion using an applied ultraviolet laser pulse in order to store quantum data. Electric fields were then used to move the ions across macroscopic distances -- up to 960 micrometres -- between different zones in the trap. The researchers repeated a sequence of 15 logical operations 3,150 times on each of 16 different starting states and found that the processor worked with an overall accuracy of 94 per cent.

One of the key innovations employed by the NIST researchers was to use two partner magnesium ions as "refrigerants" for cooling the beryllium ions as they are being transported. This "sympathetic cooling" enabled logic operations to continue without any additional error due to heating incurred during transport. "We have incorporated transport, and explicitly shown that it does not impede our ability to do further computation -- this is a crucial step for building a large-scale device," Home told physicsworld.com.

Early response to this development from the research community is positive. "Home and his team have shown the individual pieces of the puzzle to work separately in a series of beautiful experiments in recent years. Now, in this tour-de-force, they put the pieces of the puzzle together and made them all work in one experiment," said Boris Blinov, a quantum computing researcher at the University of Washington.

The road ahead

Hans Bachor, a quantum optics specialist at the Australian National University, is also impressed. "The work is indeed a great step forward and most impressive -- it demonstrates all the key steps required in the computing cycle." Bachor, however, also warns of technical challenges that lie ahead. "The question is whether they can keep the ion in the ground state. I am not aware of any in-principle problems, but it will require more tricks to be invented," he added.

Home told physicsworld.com that his team are continuing to develop their trapped ion system with a focus on two specific problems. The first area is to improve the logic operation accuracy: the accuracies required for a large-scale device are 0.9999, where the accuracy in this device is 0.95. "Here we are limited by the control we have over our laser beams, and the power of these beams," he said.

The second area is to build larger devices. "Crosstalk between different parts of the processor may be a problem which only exists in larger devices. The classical computer control, and the need for precision control of large numbers of electrodes and laser beams, represents a major technical challenge," he said.

Markus Aspelmeyer, a quantum optics researcher at the University of Vienna, recognizes another of the challenges involved in scaling up. "It will be a challenge to minimize the individual gate errors and to gain control over a large number of ions on a single chip," he said, adding: "This is however essential to perform lengthy calculations on a future quantum computer. It is an exciting challenge to both engineering and quantum information science and it is not clear yet where the exact limitations will be."

This research was reported in Science Express.

About the author

James Dacey is a reporter for physicsworld.com
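Why the jump from 0.95 to 0.9999 matters becomes clear from a back-of-the-envelope error budget. If each operation succeeds with fidelity f and errors are assumed independent (a simplification for illustration, not the NIST team's actual error model, and measured differently from the 94-per-cent figure above), a sequence of n operations succeeds with roughly f**n:

```python
# Toy error budget: overall success of n operations at per-op fidelity f,
# assuming independent errors, so overall ~ f**n.
for f in (0.95, 0.9999):
    for n in (15, 1_000, 100_000):
        print(f"f = {f}: {n:>7} ops -> {f**n:.3g}")
# At f = 0.95 even 15 ops succeed less than half the time (~0.46), while
# f = 0.9999 still gives ~0.90 after 1,000 ops.
```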
<urn:uuid:28d73219-d225-46e9-9bde-a59999849bf6>
CC-MAIN-2014-49
http://physicsworld.com/cws/article/news/2009/aug/11/tiny-device-is-first-complete-quantum-computer
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931011060.35/warc/CC-MAIN-20141125155651-00009-ip-10-235-23-156.ec2.internal.warc.gz
en
0.950995
1,354
3.546875
4
Everywhere in a Flash: The Quantum Physics of Photosynthesis

By hitting single molecules with quadrillionth-of-a-second laser pulses, scientists have revealed the quantum physics underlying photosynthesis, the process used by plants and bacteria to capture light's energy at efficiencies unapproached by human engineers.

The quantum wizardry appears to occur in each of a photosynthetic cell's millions of antenna proteins. These route energy from electrons spinning in photon-sensitive molecules to nearby reaction-center proteins, which convert it to cell-driving charges. Almost no energy is lost in between. That's because it exists in multiple places at once, and always finds the shortest path.

"The analogy I like is if you have three ways of driving home through rush hour traffic. On any given day, you take only one. You don't know if the other routes would be quicker or slower. But in quantum mechanics, you can take all three of these routes simultaneously. You don't specify where you are until you arrive, so you always choose the quickest route," said Greg Scholes, a University of Toronto biophysicist.

Scholes' findings, published Wednesday in Nature, are the strongest evidence yet for coherence -- the technical name for multiple-state existence -- in photosynthesis.

Two years ago, researchers led by then-University of California at Berkeley chemist Greg Engel found coherence in the antenna proteins of green sulfur bacteria. But their observations were made at temperatures below minus 300 degrees Fahrenheit, useful for slowing ultrafast quantum activities but leaving open the question of whether coherence operates in everyday conditions. The Nature findings, made at room temperature in common marine algae, show that it does. Moreover, similar results from an experiment on another, simpler light-harvesting structure, announced by Engel's group last Thursday on the pre-publication online arXiv, suggest that photosynthetic coherence is routine.

The findings are wondrous in themselves, adding a new dimension to something taught -- incompletely, it now seems -- to every high school biology student. They also have important implications for designers of solar cells and computers, who could benefit from quantum physics conducted in nonfrigid conditions.

"There's every reason to believe this is a general phenomenon," said Engel, now at the University of Chicago. He called Scholes' finding "an extraordinary result" that "shows us a new way to use quantum effects at high temperatures."

Scholes' team experimented on an antenna protein called PC645, already imaged at the atomic scale in earlier studies. That precise characterization allowed them to target molecules with laser pulses lasting for one-quadrillionth of a second, or just long enough to set single electrons spinning. By analyzing changes to a laser beam sent through the protein immediately afterwards, the researchers were able to extrapolate what was happening inside -- an ultra-high-tech version of shadows on a screen.

They found that energy patterns in distant molecules fluctuated in ways that betrayed a connection to each other, something only possible through quantum coherence. "It's the same as when you hit two tuning forks at the same time, and hear a low-pitched oscillation in the background. That's the interference of sound waves from the forks. That's exactly what we see," said Scholes.

According to Scholes, the physics of photosynthetic proteins will be further studied and used to improve solar cell design.
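Scholes' tuning-fork analogy is easy to reproduce numerically: adding two slightly detuned tones produces a slow envelope at their difference frequency, the acoustic counterpart of the oscillations his team saw between pigment molecules. A quick sketch (440 Hz and 444 Hz are arbitrary example values):

```python
import numpy as np

f1, f2 = 440.0, 444.0                       # two slightly detuned "forks"
t = np.linspace(0.0, 1.0, 44_100)
combined = np.sin(2*np.pi*f1*t) + np.sin(2*np.pi*f2*t)

# Trig identity: sin(a) + sin(b) = 2*cos(pi*(f1-f2)*t) * sin(pi*(f1+f2)*t).
# The slow cosine factor is the audible 4 Hz beat.
envelope = 2*np.cos(np.pi*(f1 - f2)*t)
carrier = np.sin(np.pi*(f1 + f2)*t)
assert np.allclose(combined, envelope * carrier)
print(f"beat frequency: {abs(f1 - f2):.0f} Hz")
```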
Engel suggested their use in long-promised but still-unworkable quantum computing. "This allows us to think about photosynthesis as non-unitary quantum computation," he said.

Quantum-physical processes have been observed elsewhere in the biological realm, most notably in compass cells that allow birds to navigate by Earth's geomagnetic fields. Researchers have also proposed roles for quantum physics in the animal sense of smell and even in the brain. Engel predicts the emergence of an entire field of quantum biology.

"There are going to be some surprises," said Scholes. "Who knows what else there is to discover?"

Citations: "Coherently wired light-harvesting in photosynthetic marine algae at ambient temperature." By Elisabetta Collini, Cathy Y. Wong, Krystyna E. Wilk, Paul M. G. Curmi, Paul Brumer & Gregory D. Scholes. Nature, Vol. 463 No. 7281, Feb. 4, 2010. "Long-lived quantum coherence in photosynthetic complexes at physiological temperature." By Gitt Panitchayangkoon, Dugan Hayes, Kelly A. Fransted, Justin R. Caram, Elad Harel, Jianzhong Wen, Robert E. Blankenship, Gregory S. Engel. arXiv, Jan. 28, 2010.
<urn:uuid:e4cc9021-b91d-4608-a019-456b6c7e39fa>
CC-MAIN-2014-49
http://www.wired.com/2010/02/quantum-photosynthesis/
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400379636.59/warc/CC-MAIN-20141119123259-00111-ip-10-235-23-156.ec2.internal.warc.gz
en
0.907559
1,143
3.59375
4
In life, most people try to avoid entanglement, be it with unsavory characters or alarmingly large balls of twine. In the quantum world, entanglement is a necessary step for the super-fast quantum computers of the future.

According to a study published by Nature today, physicists have successfully entangled 10 billion quantum bits, otherwise known as qubits. But the most significant part of the research is where the entanglement happened -- in silicon -- because, given that most of modern-day computing is forged in the smithy of silicon technology, this means that researchers may have an easier time incorporating quantum computers into our current gadgets.

Quantum entanglement occurs when the quantum state of one particle is linked to the quantum state of another particle, so that you can't measure one particle without also influencing the other. With this particular study, led by John Morton at the University of Oxford, UK, the researchers aligned the spins of electrons and phosphorus nuclei -- that is, the particles were entangled.

"The key to generating entanglement was to first align all the spins by using high magnetic fields and low temperatures," said Oxford's Stephanie Simmons, who also worked on the team.... "Once this has been achieved, the spins can be made to interact with each other using carefully timed microwave and radiofrequency pulses in order to create the entanglement, and then prove that it has been made." [Reuters]

If the current entanglement experiment were a cooking recipe, it would go something like this: First, embed a silicon crystal with 10 billion phosphorus atoms, cool it to close to absolute zero, and then apply a sequence of radio and microwave pulses. These pulses essentially toy with the spins of the phosphorus nuclei and their electrons until the spin of each nucleus matches the spin of one of its electrons. You end up with 10 billion entangled pairs that form a two-qubit system. It's a major breakthrough, but the researchers aren't stopping there:

"Creating 10 billion entangled pairs in silicon with high fidelity is an important step forward for us," said John Morton of Britain's Oxford University, who led the team.... "We now need to deal with the challenge of coupling these pairs together to build a scalable quantum computer in silicon." [Reuters]

Spinning particles are all well and nice, but what do they have to do with computing? How does a quantum computer actually compute?

To turn this into a silicon quantum computer, the team must create a "huge 2D grid of entanglement", in which nuclei are entangled with other phosphorus nuclei, as well as electrons, says Morton. To achieve this, electrons will be shuttled through the structure, stitching entangled states together like a thread, he says. By measuring the electron spins in a certain order, computations could be performed. [New Scientist]

Such a quantum computer would run silicon circles around conventional ones. Unlike the device sitting on your desk, quantum computers aren't limited by the 0s and 1s of binary bits. In the weird world of quantum mechanics, particles can exist in more than one state at a time -- they can be placed in a "superposition" of several possible states. That means that the qubits in a quantum computer could hold several different values simultaneously.
It has been shown theoretically that by running calculations in parallel, using many quantum states in superposition, a quantum computer could solve problems that would take a classical computer an infinite amount of time, for example, running Shor's algorithm, which factors large numbers into primes and could be used, for example, to crack the most powerful encryption algorithms on the Internet. [Nature News]

In short, a quantum computer would generate a computing power the likes of which the world has never seen, capable of running -- as well as cracking -- ever more complex algorithms. While impressed by the quantum leaps made by this research, scientists are already considering the next hurdles in the quantum computing story:

"It's nice, impressive work," says Jeremy O'Brien, a quantum-computing specialist at the University of Bristol, UK. But what is really needed, he says, is the ability to do the additional nanofabrication to put electrodes on the silicon chip to address each individual nucleus and electron pair, a technology that will be needed to get more than two spins entangled together in silicon. "That would be really impressive," he says. [Nature News]

Even though quantum computers have a ways to go before they wind up in your living room and in your everyday gadgets, thanks to successful silicon entanglement that day is getting closer.
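Shor's algorithm, mentioned in the Nature News quote, only needs quantum hardware for one step: finding the period r of a^x mod N. Everything else is classical number theory. The sketch below brute-forces that period classically for the toy case N = 15; the quantum computer's job is doing this one step efficiently for numbers hundreds of digits long:

```python
from math import gcd

def shor_classical_outline(N, a):
    """Classical skeleton of Shor's algorithm. The period-finding loop is
    the only part a quantum computer would replace (and speed up)."""
    r = 1
    while pow(a, r, N) != 1:      # find the period r of a^x mod N
        r += 1
    if r % 2 or pow(a, r // 2, N) == N - 1:
        return None               # unlucky choice of a; retry with another
    return gcd(a ** (r // 2) - 1, N), gcd(a ** (r // 2) + 1, N)

print(shor_classical_outline(15, 7))   # (3, 5): the prime factors of 15
```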
<urn:uuid:e4f76091-a12f-4fbc-951f-136ea088460a>
CC-MAIN-2014-49
http://blogs.discovermagazine.com/80beats/2011/01/19/a-step-towards-quantum-computing-entangling-10-billion-particles/
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931009968.66/warc/CC-MAIN-20141125155649-00034-ip-10-235-23-156.ec2.internal.warc.gz
en
0.925155
1,055
3.71875
4
101010: That's the number 42 represented in binary, which is the mathematical way today's binary computers see every single piece of information flowing through them, whether it's a stock price, the latest Adele track, or a calculation to generate an MRI of a tumor. But now IBM believes it's made progress in developing quantum computers, which don't use binary coding. It is not overstating the matter to say this really may be the ultimate answer in computing machines. Quick, mop your brow and don't worry: The science isn't too hard to grasp and the revolution, when it comes, could rock the world. In a very good way.

First, a little background: Computers today, everything from the chip controlling your washing machine cycle to the screen you're reading this on, rely on binary math to work. This reduces the information in problems you ask a computer to a counting system based on just "1"s and "0"s. That translates beautifully into the electronics of a computer circuit: A "1" matches up with a little burst of electricity, a "0" means none. By shuttling trillions upon trillions of these pulses, called bits, through tiny silicon circuits and transistor gates that flip their direction or trigger an ongoing signal, the chip does math with these ones and zeros. It's a mind-bogglingly complex and very swift dance that ultimately results in Angry Birds playing on the screen of your iPad. Or, after kajillions of calculations more in a supercomputer, it results in a model predicting climate change.

Now, what if instead of simply being able to do math with ones and zeros, a computer chip could work with bits that included other numbers? You'd have to design more complex circuitry, for sure, but it means every single one of those tiny electronic calculations that's happening every millisecond could tackle more information at once, and would ultimately mean a more powerful computer that may calculate faster. Got that? Good.

Now how about if instead of a one or a zero, your computer's "bits" could have any one of an infinite number of values? That's quantum computing. Essentially this moves way beyond the well-known physics of electronics, and on into the weird and wonderful world of quantum physics -- where bizarre twists of the laws of the universe mean a "bit" in a quantum computer could hold both a "1" and a "0" and any other value at the same time. That means the circuits of a quantum computer could carry out an incredibly huge number of calculations at the same time, handling more information at once than you can possibly imagine.

By using some other very strange physics (superconducting materials cooled to hundreds of degrees below freezing) IBM's research team is trying to build some of the core components of a quantum computer, and has made big progress. They're now saying they've made the quantum "bits" of information, also called qubits, live a lot longer before they essentially get scrambled. They've also worked out how to speed up the actual quantum computing circuit. IBM's progress is so impressive that they're now confident a quantum computer could be made sooner rather than later, perhaps as close as 15 years away. Whenever it arrives, the world will change.
On a very simple level, this is because instead of asking a supercomputer to work with endless strings of "1"s and "0"s to calculate all the variables in, say, a global warming simulation (performing trillions of small math calculations one after the other to work out the dynamics of the climate over a period of hours or days), a quantum computer would be able to process much of the math at the same instant instead of sequentially. Which could reduce the compute time to a second or less. Which ultimately means better and more accurate models of the climate.

Similar processing tricks could improve medical imaging, or maybe even simulations of your own particular disease's spread, which may improve treatment.

And there are many ways this tech would touch your life on an everyday basis, as well. Tasks like image recognition in Google Goggles or voice recognition in Apple's Siri rely on whisking your data off to a powerful computer, running it through a process, and sending you the results back (identifying that photo of a building as the Eiffel tower, or answering your question about the rain in Spain). These recognition problems are partly based on how good the recognition algorithm is, but also on how much time the computer can afford to spend on your problem. A quantum computer would work so swiftly that there would be no issues with spending more time trying to accurately understand your query, meaning we could reach near-perfect image and voice recognition. Perhaps even in real time, from a video feed. Imagine the sort of augmented reality tech that capability would enable, with a head-up display on your view of the world constantly delivering relevant info about everything you see.

Then think about security: Most encryption systems nowadays rely on clever math that means they couldn't be cracked even by a supercomputer running for years. A quantum computer could try every single combination of passwords to crack the security in a single second, which is pretty terrible news. That's going to force all sorts of changes with how we protect information, and yet it could also lead to more secure encryption, made by a quantum computer. There's also the matter of surveillance: Recognizing every word of every phone conversation on the planet and identifying every single face on every CCTV image would defeat all of today's supercomputer power... but maybe a quantum computer could do it. George Orwell would've loved that.

Also on the dark side, ponder how insurance firms would use or abuse this phenomenal power ("our simulation says it's 75% more plausible the accident was your fault"), or how worried nations could simulate social dynamics to try to predict crime.

Next, on the lighter side, consider art. Or at least the movies. Look at computer graphics in films: The computers in render farms that companies like Pixar use to make Brave take hours to put together a single frame, and that limits how truly amazing the image can be made. A quantum computer could tackle a render of today's Pixar movies in the blink of an eye. And that has all sorts of implications, maybe meaning CGI actors could be even more realistic. Which leads on to artificial intelligence -- a sci-fi promise that's so far been very difficult to make real, although IBM's Watson has recently wowed everyone.
What if quantum computing suddenly enabled such swift, complex calculations that a system like Watson or Siri could talk back to you convincingly, reading the nuances in your voice enough to ask, as a friend might, if you're a little stressed today and wondering if they could help? Quantum computers won't necessarily be able to speed up solving every class of problem you throw at them, but it's undeniable that they'll change modern life in many ways, at times small, at others great. As for questions on life, the universe, and everything? Those still require the human element to try to answer.
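One nuance worth adding to the password example above: for brute-force search, the known quantum speedup comes from Grover's algorithm, and it is quadratic rather than instantaneous. A quick comparison of search-space sizes (the key lengths are chosen arbitrarily for illustration):

```python
from math import sqrt

# Classical brute force tries on the order of 2**n keys; Grover's quantum
# search needs on the order of sqrt(2**n) oracle queries instead.
for bits in (40, 80, 128):
    space = 2.0 ** bits
    print(f"{bits}-bit key: ~{space:.2e} classical tries, "
          f"~{sqrt(space):.2e} Grover queries")
```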
<urn:uuid:37bc639f-1d93-4c05-8764-b3cc47b7d2d7>
CC-MAIN-2014-49
http://www.fastcompany.com/1821378/ibms-quantum-computers-could-change-world-mostly-very-good-ways
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400372819.5/warc/CC-MAIN-20141119123252-00019-ip-10-235-23-156.ec2.internal.warc.gz
en
0.960941
1,443
3.578125
4
Introduced in Alan Turing's 1936 paper On computable numbers, with an application to the Entscheidungsproblem, a universal Turing machine is a mathematical idealisation of a general purpose computer. Able to act, with appropriate input, as literally any other possible Turing machine, Turing's invention, essentially the concept of a general purpose CPU executing a stored program, was probably the largest single step taken in the development of the computer, and is often regarded as the start of computer science.

A Turing machine (TM) consists of a tape, a head which can mark and erase the tape, and a set of states. Depending on whether the tape is currently marked, and which state is occupied, the TM will erase or mark the tape or not, and move it one square left or right, at which point the next state kicks in. Additionally, there is a state which causes the TM to halt, if it is reached. The tape is considered to be of arbitrary length and composed of discrete units which are accessible to the head in strict order, singly and wholly - that is, the tape is an idealised one-bit erasable paper tape which never stretches, breaks, folds, runs out, or breaks other rules which are harder to think of. The critical thing is that though the tape may be arbitrarily large, each step of the operation of a TM is completely determined by a finite number of simple and unambiguous rules. It is completely mechanical in its operation, and always behaves in the same way for any particular state and input.

These rules defining a TM (the set of states) can be written out in a standard form as marks on a tape. The interpretation of such an on-tape representation of a TM is then a mechanical procedure which can be realised by some TM with a suitable set of states. A universal Turing machine (UTM) is a particular TM so constructed that its tape can encode any TM whatsoever, with the guarantee that the UTM will then do just what the encoded TM would do.

Suppose we have a machine M; then its output with initial tape t can be written M(t). Then a UTM U is a TM such that: for every machine Mi and tape tj there is some encoding ei,j such that U(ei,j) = Mi(tj). We'd call ei,j the encoding of Mi(tj). It's also required that the UTM can recognise input that is not a valid encoding of a TM and produce a predetermined response when this occurs.

Turing proved the existence of such UTMs by specifying one in his paper - it turned out not to be very complex - and showing it had the characteristic required, of replicating the behaviour of an arbitrary TM which is encoded on its tape. This is the essence of the modern computer: that given sufficient storage it can carry out an arbitrary program, encoded into some specific "language". The choice of a particular UTM defines a particular language. Turing's insight was that an algorithm, when encoded, is just so much data that can then be operated on by another algorithm. The idea of encoding a TM as input for execution by a UTM is pretty much all you need for the general idea of a computer program.

The fact that a UTM can emulate any TM at all makes it easy to establish fundamental equivalences between various computational methods. If a particular method can produce a UTM, then it's obvious it can compute anything computable by an arbitrary TM. Such a formalism or language is said to be Turing complete. Specifications for UTMs have been written in formalisms as diverse as XSLT, sendmail.cf and cellular automata such as Conway's game of life.
This property of universality shifts the competition from what can be computed to the number of steps and amount of input required. No matter how featureful, elegant and concise the programming language you construct, whatever computations it can perform can be done in sendmail.cf or brainfuck.

Universality has been of interest to some heterodox physicists, such as Ed Fredkin and Stephen Wolfram. Fredkin, on a suggestion of Feynman's, has been investigating the possibility of using cellular automata as a physics model and suggests suitable automata must be both universal (i.e. Turing complete) and reversible. Wolfram (also big on CA) sees in the UTM an upper bound to the complexity of the makeup of the universe. David Deutsch has proposed that "every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means", and has attempted to extend the idea of a UTM to quantum computing.

Mathematician Gregory Chaitin has used the UTM as a building block in his algorithmic information theory, refining the notion by specifying that the encodings for the TMs must instruct the UTM how long they are (Chaitin says they are 'self-delimiting') and using them to define the algorithmic complexity of a string relative to a given UTM - the length of the shortest input that will cause the UTM to output that string - and to formulate his bizarre constant Omega - the probability, for some self-delimiting UTM, that it will halt with random input. Chaitin imagines flipping a coin to determine the state of each successive bit of the unread tape, as the UTM reads in its program. It's required to be self-delimiting so that the UTM knows when to stop reading and Chaitin knows when to stop flipping coins.
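Turing's insight that an encoded machine is just data is easy to make concrete. Below is a minimal sketch of a universal interpreter in Python: the rules table is the "tape-encoded" machine (here a hypothetical bit-flipping TM invented for the example), and run_tm is the fixed universal procedure that executes whatever table it is handed. It illustrates the idea only; it is not Turing's original construction:

```python
def run_tm(rules, tape, state="start", max_steps=10_000):
    """A 'universal' interpreter: executes any Turing machine handed to it
    as data. rules maps (state, symbol) -> (write, move, next_state), where
    move is "L" or "R" and the state "halt" stops the machine."""
    cells = dict(enumerate(tape))          # the tape, as a sparse dict
    head = 0
    for _ in range(max_steps):             # guard against non-halting machines
        if state == "halt":
            break
        symbol = cells.get(head, "B")      # "B" = blank
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return [cells[i] for i in sorted(cells)]

# One particular machine, encoded purely as data: flip bits until a blank.
flipper = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "B"): ("B", "R", "halt"),
}
print(run_tm(flipper, list("1010") + ["B"]))   # ['0', '1', '0', '1', 'B']
```

Swapping in a different rules table runs a different machine on the same interpreter, which is exactly the stored-program idea.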
<urn:uuid:9355ad3b-fd7a-4af1-91c7-ce5dbc7541a4>
CC-MAIN-2014-49
http://everything2.com/title/Universal+Turing+Machine
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400380233.64/warc/CC-MAIN-20141119123300-00078-ip-10-235-23-156.ec2.internal.warc.gz
en
0.937898
1,180
3.875
4
In mathematics, the linking number is a numerical invariant that describes the linking of two closed curves in three-dimensional space. Intuitively, the linking number represents the number of times that each curve winds around the other. The linking number is always an integer, but may be positive or negative depending on the orientation of the two curves.

The linking number was introduced by Gauss in the form of the linking integral. It is an important object of study in knot theory, algebraic topology, and differential geometry, and has numerous applications in mathematics and science, including quantum mechanics, electromagnetism, and the study of DNA supercoiling.

Any two closed curves in space, if allowed to pass through themselves but not each other, can be moved into exactly one of the following standard positions. This determines the linking number:

[Figure: the six standard positions, labelled by linking number −2, −1, 0, 1, 2 and 3.]

Each curve may pass through itself during this motion, but the two curves must remain separated throughout. This is formalized as regular homotopy, which further requires that each curve be an immersion, not just any map. However, this added condition does not change the definition of linking number (it does not matter if the curves are required to always be immersions or not), which is an example of an h-principle (homotopy-principle), meaning that geometry reduces to topology.

This fact (that the linking number is the only invariant) is most easily proven by placing one circle in standard position, and then showing that linking number is the only invariant of the other circle. In detail:

- A single curve is regular homotopic to a standard circle (any knot can be unknotted if the curve is allowed to pass through itself). The fact that it is homotopic is clear, since 3-space is contractible and thus all maps into it are homotopic, though the fact that this can be done through immersions requires some geometric argument.
- The complement of a standard circle is homeomorphic to a solid torus with a point removed (this can be seen by interpreting 3-space as the 3-sphere with the point at infinity removed, and the 3-sphere as two solid tori glued along the boundary), or the complement can be analyzed directly.
- The fundamental group of 3-space minus a circle is the integers, corresponding to linking number. This can be seen via the Seifert–Van Kampen theorem (either adding the point at infinity to get a solid torus, or adding the circle to get 3-space, allows one to compute the fundamental group of the desired space).
- Thus homotopy classes of a curve in 3-space minus a circle are determined by linking number.
- It is also true that regular homotopy classes are determined by linking number, which requires additional geometric argument.

The total number of positive crossings minus the total number of negative crossings is equal to twice the linking number. That is:

    linking number = (n1 + n2 − n3 − n4) / 2

where n1, n2, n3, n4 represent the number of crossings of each of the four types. The two sums n1 + n3 and n2 + n4 are always equal, which leads to the following alternative formula:

    linking number = n1 − n4 = n2 − n3

Note that n1 − n4 involves only the undercrossings of the blue curve by the red, while n2 − n3 involves only the overcrossings.

Properties and examples

- Any two unlinked curves have linking number zero. However, two curves with linking number zero may still be linked (e.g. the Whitehead link).
- Reversing the orientation of either of the curves negates the linking number, while reversing the orientation of both curves leaves it unchanged.
- The linking number is chiral: taking the mirror image of a link negates the linking number. The convention for positive linking number is based on a right-hand rule.
- The winding number of an oriented curve in the x-y plane is equal to its linking number with the z-axis (thinking of the z-axis as a closed curve in the 3-sphere).
- More generally, if either of the curves is simple, then the first homology group of its complement is isomorphic to Z. In this case, the linking number is determined by the homology class of the other curve.
- In physics, the linking number is an example of a topological quantum number. It is related to quantum entanglement.

Gauss's integral definition

Given two non-intersecting closed curves γ1 and γ2, define the Gauss map Γ from the torus to the sphere by Γ(s, t) = (γ1(s) − γ2(t)) / |γ1(s) − γ2(t)|.

Pick a point in the unit sphere, v, so that orthogonal projection of the link to the plane perpendicular to v gives a link diagram. Observe that a point (s, t) that goes to v under the Gauss map corresponds to a crossing in the link diagram where γ1 is over γ2. Also, a neighborhood of (s, t) is mapped under the Gauss map to a neighborhood of v preserving or reversing orientation depending on the sign of the crossing. Thus in order to compute the linking number of the diagram corresponding to v it suffices to count the signed number of times the Gauss map covers v. Since v is a regular value, this is precisely the degree of the Gauss map (i.e. the signed number of times that the image of Γ covers the sphere). Isotopy invariance of the linking number is automatically obtained as the degree is invariant under homotopic maps. Any other regular value would give the same number, so the linking number doesn't depend on any particular link diagram.

This formulation of the linking number of γ1 and γ2 enables an explicit formula as a double line integral, the Gauss linking integral:

    lk(γ1, γ2) = (1 / 4π) ∮_{γ1} ∮_{γ2} [ (r1 − r2) · (dr1 × dr2) ] / |r1 − r2|³

This integral computes the total signed area of the image of the Gauss map (the integrand being the Jacobian of Γ) and then divides by the area of the sphere (which is 4π).

Generalizations

- Just as closed curves can be linked in three dimensions, any two closed manifolds of dimensions m and n may be linked in a Euclidean space of dimension m + n + 1. Any such link has an associated Gauss map, whose degree is a generalization of the linking number.
- Any framed knot has a self-linking number obtained by computing the linking number of the knot C with a new curve obtained by slightly moving the points of C along the framing vectors. The self-linking number obtained by moving vertically (along the blackboard framing) is known as Kauffman's self-linking number.
- The linking number is defined for two linked circles; given three or more circles, one can define the Milnor invariants, which are a numerical invariant generalizing linking number.
- In algebraic topology, the cup product is a far-reaching algebraic generalization of the linking number, with the Massey products being the algebraic analogs for the Milnor invariants.
- A linkless embedding of an undirected graph is an embedding into three-dimensional space such that every two cycles have zero linking number. The graphs that have a linkless embedding have a forbidden minor characterization as the graphs with no Petersen family minor.

Notes

- This is the same labeling used to compute the writhe of a knot, though in this case we only label crossings that involve both curves of the link.
- This follows from the Jordan curve theorem if either curve is simple.
For example, if the blue curve is simple, then n1 + n3 and n2 + n4 represent the number of times that the red curve crosses in and out of the region bounded by the blue curve.
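To make the Gauss linking integral concrete, here is a short numerical sketch in Python (the curve parametrizations and step count are illustrative choices of mine, not from the original article). It approximates the double integral for two circles forming a Hopf link, whose linking number is ±1 depending on orientation:

```python
import numpy as np

def linking_number(gamma1, gamma2, n=400):
    """Numerically approximate the Gauss linking integral
    lk = (1/4pi) * double integral of (r1-r2).(dr1 x dr2) / |r1-r2|^3."""
    t = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
    dt = 2 * np.pi / n
    r1 = np.array([gamma1(s) for s in t])
    r2 = np.array([gamma2(s) for s in t])
    # Central-difference tangent vectors along each closed curve.
    dr1 = (np.roll(r1, -1, axis=0) - np.roll(r1, 1, axis=0)) / (2 * dt)
    dr2 = (np.roll(r2, -1, axis=0) - np.roll(r2, 1, axis=0)) / (2 * dt)
    total = 0.0
    for i in range(n):
        d = r1[i] - r2                         # r1(s) - r2(t) for all t
        cross = np.cross(dr1[i], dr2)          # dr1 x dr2
        total += np.sum(np.einsum('ij,ij->i', d, cross)
                        / np.linalg.norm(d, axis=1) ** 3) * dt * dt
    return total / (4 * np.pi)

# A Hopf link: a unit circle in the x-y plane, and a unit circle in the
# x-z plane shifted so that it threads the first one exactly once.
c1 = lambda s: np.array([np.cos(s), np.sin(s), 0.0])
c2 = lambda s: np.array([1.0 + np.cos(s), 0.0, np.sin(s)])
print(round(linking_number(c1, c2)))           # -> 1 or -1
```

Moving the second circle far away (offset 3.0 instead of 1.0) unlinks the curves and the integral drops to nearly zero, matching the first property listed above.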
In the world of computers, silicon is king. The semiconducting element forms regular, near-perfect crystals into which chipmakers can carve the hundreds of millions of features that make up the microchips powering processors. Technological improvements let chipmakers cut the size of those features in half every 18 months, a feat known as Moore's law, after Intel cofounder Gordon Moore. Today, that size hovers around 180 nanometers (180 billionths of a meter), and researchers expect to push below 50 nanometers within a decade. But that's about as far as silicon can go: below that, quantum physics makes electrons too unruly to stay inside the lines. If computers are to keep up with Moore's law, they will have to move beyond silicon. After a couple of decades of theorizing, computer scientists, bioengineers and chemists in the mid-1990s began lab experiments seeking alternative materials for future CPUs and memory chips. Today, their research falls into three broad categories: quantum, molecular and biological computing. In the field of quantum computing, researchers seek to harness the quantum effects that will be silicon's undoing. Scientists have succeeded in making rudimentary logic gates out of molecules, atoms and sub-atomic particles such as electrons. And incredibly, other teams have discovered ways to perform simple calculations using DNA strands or microorganisms that group and modify themselves.

Molecular Building Blocks

In one type of molecular computing (or nanocomputing), joint teams at Hewlett Packard Co. and UCLA sandwich complex organic molecules between metal electrodes coursing through a silicon substrate. The molecules orient themselves on the wires and act as switches. Another team at Rice and Yale universities has identified other molecules with similar properties. Normally, the molecules won't let electrons pass through to the electrodes, so a quantum property called tunneling, long used in electronics, is manipulated with an electric current to force the electrons through at the proper rate. If researchers can figure out how to lay down billions of these communicating molecules, they'll be able to build programmable memory and CPU logic that is potentially millions of times more powerful than in today's computers. Molecular researchers like the HP/UCLA team, however, face a challenge in miniaturizing their current wiring technology (nanowires made from silicon strands) from several hundred nanometers to approximately 10 nanometers. Carbon nanotubes are promising substitutes. The rigid pipes make excellent conductors, but scientists must figure out how to wrangle them into the latticework needed for complex circuitry. "We've shown that the switching works," says HP computer architect Philip Kuekes. "But there is still not as good an understanding of the basic mechanism so that an engineer can design with it." Hewlett Packard and UCLA have jointly patented several techniques for manufacturing molecular computers, most recently in January 2002.

Although molecular circuits employ some quantum effects, a separate but related community of scientists is exploring the possibilities of quantum computing: computing with atoms and their component parts. It works from the notion that some aspect of a sub-atomic particle (say, the location of an electron's orbit around a nucleus) can be used to represent the 1s and 0s of computers. As with molecules, these states can be manipulated (programmed, in effect).
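To picture what "programming" a two-state particle means, here is a minimal sketch (my own illustration, not from the article; it assumes NumPy) that represents a spin as a two-amplitude state vector and flips it with the Pauli-X operation, the quantum analogue of a NOT gate:

```python
import numpy as np

# A two-level quantum system (e.g. an electron spin) as a state vector.
zero = np.array([1, 0], dtype=complex)  # "spin down" -> logical 0
one = np.array([0, 1], dtype=complex)   # "spin up"   -> logical 1

# Flipping the spin, as the NMR scheme described below does with
# electromagnetic pulses, corresponds to the Pauli-X (NOT) operation.
X = np.array([[0, 1],
              [1, 0]], dtype=complex)

print(X @ zero)  # [0.+0.j 1.+0.j] -> the state |1>
print(X @ one)   # [1.+0.j 0.+0.j] -> back to |0>
```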
One approach, pursued by members of a national consortium that includes Berkeley, Harvard, IBM and MIT, involves flipping the direction of a spinning electron to turn switches on or off. By applying electromagnetic radiation in a process called nuclear magnetic resonance (NMR), like that used in medical imaging, researchers can control the spin of the carbon and hydrogen nuclei in chloroform. Alternatively, filters and mirrors show promise for controlling photons as a switching mechanism. Other researchers work with materials such as quantum "dots" (electrons in silicon crystal) and "ion traps" (ionized atoms suspended in an electrical field). Quantum bits (qubits) have an unusual quality that makes them a double-edged sword for computing purposes, though. Due to the lack of determinism inherent in quantum mechanics, qubits can be on and off simultaneously, a phenomenon called superposition. This makes it harder to force qubits into digital lockstep, but it also multiplies exponentially the amount of information groups of qubits can store. It theoretically allows massively parallel computation to solve problems previously thought intractable, such as factoring very large numbers into their prime factors. One implication: today's encryption techniques depend on the infeasibility of computing the two factors of certain large numbers, so quantum computers may one day be able to crack most encrypted files that exist today. This possibility has given the research a boost from government agencies, including the National Security Agency. To be manufacturable, quantum computers will require billions of such sub-atomic switches working together and interacting with their environments without falling into a disorganized state called decoherence. A quantum state called entanglement, where many atoms are made to behave exactly alike, provides one possible solution. Researchers also hope to fight decoherence by harnessing a phenomenon called interference, that is, the overlapping of quantum particles' wavelike energy.

Getting Down to the Biology

In addition to molecular and quantum computing, a third approach, biological computing, relies on living mechanisms to perform logic operations. Bioengineers have long understood how to manipulate genes to function as switches that activate other genes. Now they're using the technique to build rudimentary computer "clocks" and logic gates inside bacteria such as E. coli. Other researchers use genes to prod microorganisms into states that represent information. A team headed by Thomas Knight at the MIT Artificial Intelligence Laboratory genetically manipulates luciferase, an enzyme in luminescent creatures such as fireflies, to generate light that serves as a medium of cell-to-cell communication. One of biological computing's biggest challenges is calculating with elements that are flawed, unreliable and decentralized. To that end, Knight's amorphous computing group studies ways to encourage bacteria to organize themselves into parallel-processing computers. "I don't think of it as likely to be the path to making conventional computers," Knight says. "It will be the way in which we build the molecular-scale computers." Molecular computers face similar reliability challenges. At HP, researchers used fault-tolerant algorithms to construct a silicon-based computer called Teramac that worked despite having 220,000 defects. Kuekes, Teramac's project manager, says the company is now exploring ways to translate what they've learned to molecular computing.
Farther out on the biological curve is DNA computing, which attempts to exploit the way DNA strands recognize each other and combine into structures that could perform large, compute-intensive calculations in parallel. Few in the biological community expect biocomputers to replace the general-purpose silicon computer. They hope instead to manufacture molecular computers cheaply and efficiently with organisms that can orient themselves into logic circuits or transform vats of chemicals to manufacture other chemicals. Still more exciting possibilities come from the potential of special-purpose biological computers to interact with other biological systems. Miniature computers could be injected into living tissue to reprogram cancer-causing genes, for example, or administer insulin shots. For now, all these applications loom distant on the horizon. But researchers agree that silicon’s days are numbered, and that radical new approaches will be needed to keep computers zooming through the 21st century.
Quantum computers offer the promise of processing information much more efficiently than classical computers. But before quantum computers can be built, scientists must confront several challenges, one of which is quantum computers' vulnerability to their surroundings. Interaction with outside forces would immediately damage a quantum computer's information; this problem is known as "decoherence." One method to coherently process quantum information involves cavity quantum electrodynamics (QED). In this method, scientists use a small cavity to achieve coherent dynamics between an atom and a photon by manipulating an atom's radiation properties with mirrors. Scientists from the California Institute of Technology are among the leaders in cavity QED, and have recently reported an important advance to enable a coherent distribution of quantum information across a network. In their paper published in Physical Review Letters, physicist David Boozer and his colleagues have demonstrated the reversible state transfer of a coherent light pulse to and from the internal state of an atom trapped in an optical cavity. This observation is the first verification of atomic physicist Ignacio Cirac's proposal for the reversible mapping of quantum states between light and matter using cavity QED to provide strong coupling for the atom-photon interaction. “The most significant result of this work is the demonstration of reversibility (i.e., coherence) for the light emission and absorption processes,” Boozer told PhysOrg.com. “The fact that this process is coherent means that it preserves superpositions of quantum states, hence it is a way of mapping quantum information between an atom and light.” In quantum networks, qubits (the information states for quantum computers) can be represented by either atoms or photons. Atoms, which have long coherence times, serve as "stationary" qubits, or nodes of a network, where they are stored and locally manipulated. Photons, on the other hand, serve as "flying" qubits, or quantum channels that connect nodes over long distances. While many single-photon sources have been demonstrated in the past decade, none have been experimentally shown to be reversible until now. “In principle, in a quantum computer there are several logic gates, each of which performs an elementary quantum operation on one or two stationary qubits,” Boozer explained. “The gates are connected together in a network, so that the output of one gate can be transported as a flying qubit to the input of the next gate. Hence, one needs a way to turn stationary qubits into flying qubits and vice-versa, which is what our recent work has demonstrated.” In the Caltech scientists' experiment, a cesium atom is localized within the cavity by a far off-resonant optical trap, where it repeatedly undergoes a series of light absorption and reemission cycles, lasting a total of 360 ms. During each such cycle, the cavity is first illuminated by an incident pulse of coherent light. Whenever the atom-cavity system absorbs this pulse, the quantum state of the light is written onto the internal state of the atom. After a delay of about 300 ns, the atomic state gets mapped back onto an emitted pulse of light, which is allowed to interfere with the source of the original coherent pulse. Observing the resulting interference fringe demonstrates the reversibility of the overall absorption-reemission process. 
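The absorption and reemission cycle described above is, at heart, a coherent swap of a single excitation between the atom and the cavity field. The following sketch is my own illustration of that idea, assuming an idealized single-excitation Jaynes-Cummings model with no decoherence; the coupling rate is an arbitrary placeholder, not a figure from the experiment:

```python
import numpy as np
from scipy.linalg import expm

# Single-excitation subspace of the Jaynes-Cummings model:
# basis state 0 = |atom excited, 0 photons>,
# basis state 1 = |atom ground, 1 photon>.
g = 2 * np.pi * 1e6                        # assumed coupling rate (rad/s)
H = g * np.array([[0, 1],
                  [1, 0]], dtype=complex)  # hbar set to 1

psi0 = np.array([1, 0], dtype=complex)     # excitation starts in the atom
for t in np.linspace(0, np.pi / g, 5):
    psi = expm(-1j * H * t) @ psi0
    p_photon = abs(psi[1]) ** 2            # probability the field holds it
    print(f"t = {t:.3e} s   P(photon) = {p_photon:.2f}")
```

The printout rises from 0 to 1 and back again: the quantum state is written onto the light field and then retrieved, which is the reversibility the experiment demonstrates. The real system, of course, must also outrun spontaneous emission and cavity leakage.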
“Our optical cavity has a very small mode volume (the cavity length is only 42 microns), which ensures that the coherent interaction between the atom and light field occurs on a much faster time scale than the decoherence caused by atomic spontaneous emission or cavity leakage,” Boozer explained. “Thus the atom and cavity field can exchange quantum information coherently many times before an incoherent process occurs. This regime is known as strong-coupling in cavity QED.” The scientists explain that the efficiency of the light-to-atom transfer is limited in this scenario by factors such as passive mirror losses, equal transmission coefficients of the cavity mirrors, and the coupling of the atom to both polarization modes of the cavity. With the ability to reversibly transfer a qubit's state from "flying" to "stationary" and back again, the scientists have taken a step toward coherently transferring quantum information across a network, without disruption from the outside world. Still, Boozer and his colleagues look forward to future improvements. “In the present work, the qubit is encoded in the photon-number states of light and in the hyperfine levels of the atom,” he said. “A more robust scheme which we may pursue in the future would be to instead use the polarization degree of freedom of the light, and the magnetic sublevels of the atom. Another future goal will be to increase the efficiency of the state transfer process, for instance by using cavity mirrors with unequal transmissivities and/or even higher reflectivities.” Citation: Boozer, A. D., Boca, A., Miller, R., Northup, T. E., and Kimble, H. J. "Reversible State Transfer between Light and a Single Trapped Atom." Physical Review Letters 98, 193601 (2007).
Quantum mechanics isn’t what it used to be. Several decades ago it was all about how, at the very small scales of atoms, energy comes in chunks or “quanta”: not continuous, like water, but discrete, like money. Even light is grainy, divided up into little packets of energy called photons. But never mind all that. Today, quantum physicists aren’t really talking about quanta, they’re talking about information. They suspect that at its root quantum mechanics is a theory about what can and can’t be known about the world. The famous uncertainty principle, and the idea that quantum objects might be either here or there, are examples of that idea. It’s not all theory, though. The new view offers potential applications in the form of so-called quantum information technology: ways of storing, transmitting and manipulating information that work using quantum rules rather than the “classical” rules of our everyday world. The most celebrated manifestation of this technology is the quantum computer, which could exploit quantum principles to achieve far greater power than the devices on which I’m writing and you are reading. Although it’s clear to those in the field how quantum computers should work, no one knows how to make one. Scientists have made “toy” quantum computers with just a handful of qubits (compared to the billions of bits in your smartphone), and some companies are even starting to offer primitive versions for sale – to the scepticism of some experts. But despite tantalising reports of incremental breakthroughs over the past few years, there’s still no prospect that you’ll have a useful quantum laptop in the near future.

However, scientists in Germany have just reported what could be a significant step forward. They say that the ideal material for a quantum computer could be diamond. Don’t despair – that doesn’t mean they will cost the earth. The very thin films of diamond needed for such devices don’t have to be mined; they can be made artificially from carbon-rich gases such as methane. It’s not exactly cheap, but neither are the methods needed to make semiconductor films for a host of existing electronic devices. Both conventional and quantum computers work by encoding and manipulating information in binary form: as “bits”, represented as zeroes and ones. Florian Dolde at the University of Stuttgart and his colleagues think the ideal elements that will store this information on a quantum computer are individual nitrogen atoms implanted into a diamond film. Nitrogen atoms have one more electron than the carbon atoms in diamond, and this spare electron can exist in two different quantum states thanks to a property called spin. Rather like the poles of a magnet (which are used to store information in magnetic disks and tapes), an electron spin can be considered to point either “up” or “down”. That much has been known for some time, and others have experimented with nitrogen-doped diamond for quantum computing. The advance made by Dolde and colleagues is to show how they can control these nitrogen electron spins without having to cool the diamond to very low temperatures.

In a spin

The reason quantum computers could be so powerful is that a collection of bits could exist in many more different states than the same number of “classical” bits. That’s because quantum particles can exist in two or more different states at the same time – in a so-called superposition of states. So each quantum bit (qubit) can be not just a 1 or a 0 but mixtures of both.
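Here, concretely, is what "mixtures of both" means. This minimal sketch (my own illustration, assuming NumPy) puts a simulated qubit into an equal superposition with a Hadamard operation and then samples measurements, which come out 0 or 1 with equal probability:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hadamard gate: maps |0> to the equal superposition (|0> + |1>)/sqrt(2).
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ np.array([1, 0], dtype=complex)

# Born rule: measurement probabilities are the squared amplitudes.
probs = np.abs(psi) ** 2
shots = rng.choice([0, 1], size=10_000, p=probs)
print(probs)               # [0.5 0.5]
print(np.bincount(shots))  # roughly [5000 5000]
```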
As a result, a group of qubits could perform many different calculations at once, rather than having to do them sequentially like an ordinary computer. To enable that, it’s generally thought that the qubits have to be entangled. This means that the quantum state of one of them depends on the states of the others – even though these states aren’t actually assigned until they are measured. In other words, if you entangle a pair of spins that have opposite orientations, and measure one of them as being “up”, the other instantly becomes “down”, no matter how far away it is. Some early quantum theorists, including Einstein, thought this would be impossible, but this entanglement is now a well-established fact.

But here’s the rub: like most quantum properties, entanglement seems to be very delicate. Amid all the jostling of other atoms, a pair of entangled particles can lose their special connection so that their states become independent of each other. Sustaining entanglement has tended to mean cooling the particles down close to absolute zero to remove that jostling. But a quantum computer that needs to be so cold won’t ever find much of a market. Dolde and colleagues have shown, however, that two nitrogen atoms trapped in diamond tens of nanometres apart can be kept entangled at room temperature for more than a millisecond (a thousandth of a second), which could be long enough to perform quantum calculations. They created the closely spaced nitrogen defects by firing a beam of nitrogen ions (charged atoms) at a diamond film through a mask with holes about 20 nanometres apart, and used microwave photons to nudge the atoms into an entangled state.

The case for nitrogen-doped diamond quantum computers is boosted further by a paper from Martin Plenio of the University of Ulm in Germany and his co-workers, who have shown that in theory – no more than that yet – such a system could be used as a “quantum simulator”: a kind of quantum computer that can calculate how other quantum systems will behave. The mathematics needed to predict quantum behaviour is complicated, and ordinary computers struggle to accommodate it. But a quantum simulator, working by quantum rules, already has the “quantum-ness” built in to its components, and so can carry out such calculations much more easily. Diamond, of all things, could take the hardness out of the problem.
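A toy simulation also makes the "up here means down there" correlation vivid. This sketch (again my own illustration, not the Stuttgart experiment) samples measurements of a two-qubit Bell state in which the spins are always opposite:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-qubit state vector over the basis |00>, |01>, |10>, |11>.
# The Bell state (|01> - |10>)/sqrt(2) encodes a pair of anti-correlated
# spins: whenever one is measured "up", the other reads "down".
bell = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2          # Born rule over the four outcomes

for _ in range(5):
    outcome = rng.choice(4, p=probs)
    a, b = outcome >> 1, outcome & 1   # split the index into the two spins
    print("spin 1:", "up" if a else "down", "  spin 2:", "up" if b else "down")
```

Every line prints one spin up and the other down; the outcomes 00 and 11 have zero amplitude, so they never occur. What the toy model cannot capture is that a real entangled pair keeps this correlation even when the two spins are measured far apart.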
Binary refers to any system that uses two alternative states, components, conditions or conclusions. The binary, or base 2, numbering system uses combinations of just two unique numbers, i.e., zero and one, to represent all values, in contrast with the decimal system (base 10), which uses combinations of ten unique numbers, i.e., zero through nine. Virtually all electronic computers are designed to operate internally with all information encoded in binary numbers. This is because it is relatively simple to construct electronic circuits that generate two distinct voltage levels (i.e., off and on, or low and high) to represent zero and one: transistors and capacitors, the fundamental components of processors (the logic units of computers) and memory, generally have only two distinct states, off and on. The values of bits are stored in various ways, depending on the medium. For example, the value of each bit is stored as an electrical charge in a single capacitor within a RAM (random access memory) chip. It is stored as the magnetization of a microscopic area of magnetic material on a platter in a hard disk drive (HDD) or on a floppy disk. It is stored along the spiral track on an optical disk as a change from a pit to the surface or from the surface to a pit (representing a one) and as no change (representing a zero). Computers are almost always designed to store data and execute instructions in larger and more meaningful units called bytes, although they usually also provide ways to test and manipulate single bits. Bytes are abbreviated with an upper case B, and bits are abbreviated with a lower case b. The number of bits in a byte varied according to the manufacturer and model of computer in the early days of computing, but today virtually all computers use bytes that consist of eight bits. Whereas a bit can have only one of two values, an eight-bit byte can have any of 256 possible values, because there are 256 possible permutations (i.e., combinations of zero and one) for eight consecutive bits (i.e., 2^8). Thus, an eight-bit byte can represent any unsigned integer from zero through 255 or any signed integer from -128 to 127. It can also represent any character (i.e., letter, number, punctuation mark or symbol) in a seven-bit or eight-bit character encoding system (such as ASCII, the default character encoding used on most computers). The number of bits is often used to classify generations of computers and their components, particularly CPUs (central processing units) and busses, and to provide an indication of their capabilities. However, such terminology can be confusing or misleading when used in an imprecise manner, which it frequently is. For example, classifying a computer as a 32-bit machine might mean that its data registers are 32 bits wide, that it uses 32 bits to identify each address in memory or that its address buses or data buses are of that size. A register is a very small amount of very fast memory that is built into the CPU in order to speed up its operations by providing quick access to commonly used values. Whereas using more bits for registers makes computers faster, using more bits for addresses enables them to support larger programs. A bus is a set of wires that connects components within a computer, such as the CPU and the memory. A 32-bit bus transmits 32 bits in parallel (i.e., simultaneously rather than sequentially).
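As a quick check on the byte arithmetic above, this sketch (illustrative Python, not from the original article) counts the 2^8 patterns of a byte, shows the unsigned and two's-complement signed ranges, and tests and sets single bits:

```python
n_patterns = 2 ** 8                 # 256 distinct patterns in eight bits
print(n_patterns)
print(0, "to", n_patterns - 1)      # unsigned range: 0 to 255
print(-(n_patterns // 2), "to", n_patterns // 2 - 1)  # signed: -128 to 127

byte = 0b10110010
print((byte >> 3) & 1)              # read bit 3 (counting from bit 0) -> 0
byte |= 1 << 0                      # set the lowest bit
print(bin(byte))                    # 0b10110011
```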
Although CPUs that treat data in 32-bit chunks (i.e., processors with 32-bit registers and 32-bit memory addresses) still constitute the personal computer mainstream, 64-bit processors are common in high-performance servers and are now being used in an increasing number of personal computers as well. The rate of data transfer in computer networks and telecommunications systems is referred to as the bit rate or bandwidth, and it is usually measured in terms of some multiple of bits per second, abbreviated bps, such as kilobits, megabits or gigabits (i.e., billions of bits) per second. A bitmap is a method of storing graphics (i.e., images) in which each pixel (i.e., dot that is used to form an image on a display screen) is stored as one or several bits. Graphics are also often described in terms of bit depth, which is the number of bits used to represent each pixel. A single-bit pixel is monochrome (i.e., either black or white), a two-bit pixel can represent any of four colors (or black and white and two shades of gray), an eight-bit pixel can represent 256 colors, and 24-bit and 32-bit pixels support highly realistic color, which is referred to as true color. The word bit was invented in the latter half of the 1940s by John W. Tukey (1915-2000), an eminent statistician, while working at Bell Labs (the research arm of AT&T, the former U.S. telecommunications monopoly). He coined it as a contraction of the term binary digit and as a handier alternative to bigit or binit. Tukey also coined the word software. The term bit was first used in an influential publication by Claude E. Shannon (1916-2001), also while at Bell Labs, in his seminal 1948 paper A Mathematical Theory of Communication. Shannon, widely regarded as the father of information theory, developed a theory that for the first time treated communication as a rigorously stated mathematical problem and provided communications engineers with a technique for determining the capacities of communications channels in terms of bits. Although the bit has been the smallest unit of storage used in computing so far, much research is being conducted on qubits, the basic unit of information in quantum computing (which is based on phenomena that occur at the atomic and subatomic levels). Qubits hold an exponentially greater amount of information than conventional bits.
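As a worked example of the bit-depth arithmetic above (my own illustration; the resolution is an arbitrary choice), the storage needed for an uncompressed bitmap is simply width times height times bits per pixel:

```python
# Uncompressed bitmap size: width * height * bits-per-pixel / 8 bytes.
width, height = 1920, 1080
for depth in (1, 8, 24):            # monochrome, 256 colors, true color
    size_bytes = width * height * depth // 8
    print(f"{depth:2d}-bit: {size_bytes / 1_000_000:6.2f} MB")
```

A single-bit image at that resolution takes about a quarter of a megabyte, while the 24-bit true-color version runs to roughly six megabytes.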
Few modern materials have achieved the fame of silicon, a key element of computer chips. The next generation of computers, however, may not rely so much on silicon. University at Buffalo researchers are among scientists working to identify materials that could one day replace silicon to make computing faster. Their latest find: A vanadium oxide bronze whose unusual electrical properties could increase the speed at which information is transferred and stored. This week, design company 4DSP has launched live industry demonstrations of licensed NASA fiber optic sensing and 3D shape rendering technology. Past fiber optic sensing solutions have been limited by both processing speed and high deployment costs, and 4DSP expects the new technology to offer a 20-fold improvement in performance. According to data from a 2008 Business R&D and Innovation Survey by the National Science Foundation, businesses perform the lion's share of their R&D activity in just a small number of geographic areas, particularly the San Jose-San Francisco-Oakland area and the New York-Newark-Bridgeport area. A professor from Tel Aviv University is reconfiguring existing complementary metal-oxide-semiconductor (CMOS) chips designed for computers and turning them into high-frequency circuits. The ultimate goal of this project is to produce chips with radiation capabilities that are able to see through packaging and clothing to produce an image of what may be hidden beneath. A new, Massachusetts Institute of Technology-developed analytical method identifies the precise binding sites of transcription factors—proteins that regulate the production of other proteins—with 10 times the accuracy of its predecessors. A European research team has recently been able to demonstrate that germanium, under certain conditions, can function as a laser material. Together with silicon, the researchers report, germanium lasers could form the basis for innovative computer chips in which information would be transferred partially in the form of light. Researchers from North Carolina State University have developed a new software tool to prevent performance disruptions in cloud computing systems by automatically identifying and responding to potential anomalies before they can develop into problems. Computers may be getting faster every year, but those advances in computer speed could be dwarfed if their 1s and 0s were represented by bursts of light, instead of electricity. Researchers at the University of Pennsylvania have made an important advance in this frontier of photonics, fashioning the first all-optical photonic switch out of cadmium sulfide nanowires. Scientists from the University of Aberdeen's Marine Biodiscovery Center and the University of St Andrews presented their work on the components of a new type of computer chip created using molecules from a sea squirt sourced from the bottom of the Great Barrier Reef. In Finland, researchers have experimentally determined the conditions for rebounding of water droplets moving on superhydrophobic surfaces. Like billiard balls, these droplets move by way of collisions, allowing the scientists to build “droplet logic”. When combined with chemical reactions these devices demonstrate elementary Boolean logic operations. Particular sequences of the familiar double helix structure of DNA form genes, which tell cells how to make proteins. But the vast majority of DNA lies outside of genes and is poorly understood. 
A massive project by more than 500 scientists to gain a comprehensive look at how our DNA works has produced an encyclopedia of information that reveals extraordinarily complex networks that tell our genes what to do. It also reveals just how much of the human genome is active. A refined method developed at NIST for measuring nanometer-sized objects may help computer manufacturers more effectively size up the myriad tiny switches packed onto chips' surfaces. The method, which makes use of multiple measuring instruments and statistical techniques, is already drawing attention from industry. Only about 1% of the human genome contains gene regions that code for proteins, raising the question of what the rest of the DNA is doing. Scientists have now begun to discover the answer: About 80% of the genome is biochemically active, and likely involved in regulating the expression of nearby genes, according to a study from a large international team of researchers. Over the past few decades, the hunt for extrasolar planets has yielded incredible discoveries. Now, planetary researchers have a new tool—simulated models of how planets are born. A team of researchers at The University of Texas at Austin is using supercomputers to model and simulate the protostellar disks that precede the formation of planets. Disorders such as schizophrenia can originate in certain regions of the brain and then spread out to affect connected areas. Identifying these regions of the brain, and how they affect the other areas they communicate with, would allow drug companies to develop better treatments and could ultimately help doctors make a diagnosis. But interpreting the vast amount of data produced by brain scans to identify these connecting regions had so far proved impossible until now. An international research collaboration led by scientists in the U.K. has developed a new approach to quantum computing that could lead to more widespread use of new quantum technologies. The breakthrough has been a move from glass-based circuitry that allowed circuits to manipulate photons to a silicon-based technology that accomplishes the same calculations using quantum mechanical effects. Most major Websites maintain huge databases. Almost any transaction on a shopping site, travel site, or social networking site requires multiple database queries, which can slow response time. Now, researchers at Massachusetts Institute of Technology have developed a system that automatically streamlines Websites' database access patterns, making the sites up to three times as fast. Researchers from the Australian National University have taken a quantum leap towards developing the next generation of super-fast networks needed to drive future computers. The team has developed a technique that allows for quantum information to travel at higher bandwidth using a beam of light and the phenomenon called entanglement. On Tuesday IBM introduced a new line of mainframe computers the company calls its most powerful and technologically advanced ever. The zEnterprise EC12 mainframe server is designed to help users securely and quickly sift through massive amounts of data. Running at 5.5 GHz, IBM said the microprocessor that powers the mainframe is the fastest chip in the world. A critical element in any microchip is an inverter—an electronic component that spits out zeros when it is given ones, and vice versa.
Complementary metal-oxide-semiconductor, or CMOS, is the industry standard for this type of component, but still requires billions of dollars to achieve production scale. Researchers have recently pioneered a room-temperature additive process that creates a nanoscale inverter quickly and at low cost. Cancer metastasis, the escape and spread of primary tumor cells, is a common cause of cancer-related deaths. But metastasis remains poorly understood, and only recently have studies indicated that blood's “stickiness” actually tears off tumor cells. Using a statistical technique employed by animators, scientists created a new computer simulation that reveals how cancer cells enter the bloodstream and the physical forces involved. At outdoor athletic competitions, at the Olympic Games for example, athletes push themselves to the limit. But it's hard to depict this in pictures alone. Researchers at the Fraunhofer Institute in Germany have created an intelligent camera that instantly delivers a more complete picture of the action, supplying additional metadata such as acceleration, temperature, or heart rate. Researchers at the Stanford University School of Medicine and Intel Corp. have collaborated to synthesize and study a grid-like array of short pieces of a disease-associated protein on silicon chips normally used in computer microprocessors. Used recently to identify patients with a severe form of lupus, the new technology has the potential to improve diagnoses of a multitude of diseases. A research team at the University of California, Santa Barbara has designed and fabricated a quantum processor capable of factoring a composite number—in this case the number 15—into its constituent prime factors, 3 and 5. Although modest compared to, say, a 600-digit number, the algorithm they developed was right about half the time, matching theoretical predictions and marking a milestone on the trail of building a stronger quantum computer. Using next-generation sequencing technology and a new strategy to encode 1,000 times the largest data size previously achieved in DNA, Harvard University geneticist George Church has encoded his book in life's language. While the volume of data is comparatively modest, the density of 5.5 petabits, or 1 million gigabits per cubic meter, is off the charts.
An artist's rendering of a molecular defect predicted to be a good qubit for quantum computing. Credit: courtesy of J. R. Weber et al., and rendered by Peter Allen

This Behind the Scenes article was provided to LiveScience in partnership with the National Science Foundation. Quantum computers may represent the next major paradigm shift in technology. In theory, such computers could perform faster and more complex computations using a fraction of the energy. However, in practice, building a quantum computer is a very tricky engineering challenge. At the atomic level, particles do not behave in a way one would expect from the laws of classical physics. According to the Heisenberg uncertainty principle, it is impossible to precisely determine the speed and location of a particle at any given moment. Instead, particles are characterized by a wave function that represents a probability that the particle will be in a given physical state. In quantum computing, instead of 0s and 1s, information is encoded in that wave function and the infinite variations that are possible in the spectrum of the wave. "You have a lot more flexibility in setting the values of the things that you compute," said Chris Van de Walle, who, as a professor at the University of California, Santa Barbara, studies potential quantum systems. "You could have any continuous value that is being encoded in the wave function of some entity that you are now using as your fundamental unit of computing." If it sounds far-out, it is. A quantum bit, or qubit, is the basic unit of information in quantum computing; where a classical bit is either a 1 or a 0, a qubit can represent 1 and 0 at the same time. Over the last decade, researchers have investigated various ways of designing a practical implementation of a qubit. None is near completion. "If you can come up with such qubits and incorporate them in the computing architecture, it has been shown theoretically that you can solve problems computationally that are currently not feasible," Van de Walle said. "The big challenge is to come up with specific implementations of these qubits." One of the most promising implementations involves a defect in diamonds that leads to a missing carbon in the material's matrix, with a rogue nitrogen atom located nearby. This altered structure creates a hole, or vacancy — called an NV (nitrogen vacancy) center — with a specific wave function that many believe can be effectively manipulated for quantum computing. In industry, defects are a negative. But when it comes to materials for quantum computing, it is the defect that makes computation possible. "The defect is actually a good actor," Van de Walle said. "It's the qubit that you want to use as your unit of computation." The biggest advantage of NV centers in diamonds is their ability to operate at room temperature, rather than requiring near-absolute-zero temperatures, as other quantum computing systems do. Electrons in the NV center also can remain coherent for a long time and be manipulated by outside forces. "You can control where the vacancy is formed in the crystal and you can probe it very accurately with laser beams with a specific wave length," Van de Walle said. Van de Walle, an expert in defects and impurities, has been working closely with David Awschalom, an experimentalist at UC Santa Barbara and a quantum computing expert, to expose the atomic-level dynamics of the diamond center.
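The "lot more flexibility" Van de Walle describes is also why classical machines struggle to simulate quantum systems, and why the simulations described below need a supercomputer: the memory required grows exponentially with system size. A rough sketch (my own back-of-the-envelope illustration, assuming 16 bytes per stored amplitude):

```python
# Representing n qubits classically takes 2**n complex amplitudes.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    bytes_needed = amplitudes * 16   # complex128 = 16 bytes per amplitude
    print(f"{n:2d} qubits: 2^{n} amplitudes ~ {bytes_needed:.3g} bytes")
```

Ten qubits fit in a few kilobytes; fifty would demand tens of petabytes, which is why the exponential state space that makes quantum computers powerful also makes them hard to study classically.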
Van de Walle's computational simulations on the National Science Foundation-supported Ranger supercomputer at the Texas Advanced Computing Center matched experimental results for the NV center. The simulations also added a few crucial pieces of information about the NV center. In particular, they found that the defect's charge state plays a crucial role in achieving a useable wavelength. This means one must control material doping in order to control the number of electrons that can enter a vacancy. "For NV centers in diamonds, the optimal charge state is a negative one charge state," Van de Walle said. "For defects in other materials, it may be a different charge state, and just by guessing the charge state, you wouldn't be able to know if it's a good choice. But that's what we can calculate." Simulating the quantum mechanical interactions of hundreds of atoms requires thousands of computer processors working in tandem for days. "Without the ability to run on Texas Advanced Computing Center's supercomputers, we would simply not have been able to do this project," Van de Walle said. The high-fidelity quantum simulations inspire confidence among the researchers' experimental collaborators and generate new ideas for lab experiments. "The ability to take our expertise in the area of defects and to use it creatively to design defects with certain properties is really great," Van de Walle said. "It's exciting to be able to dig into what we know about defects and use all of that knowledge to construct a defect with a given property." Editor's Note: The researchers depicted in Behind the Scenes articles have been supported by the National Science Foundation, the federal agency charged with funding basic research and education across all fields of science and engineering. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.
Computer network topology - types of networking topologies

What is Topology? The virtual shape or structure of a network is referred to as its topology. This virtual design does not have to correspond to the physical layout of the network: you could arrange the computers of a home network in a circle, but that alone would not give it a ring topology. A network's topology can be determined by mapping, graphically, the logical and/or physical connections between its nodes. Graph theory is used to study network topology: two networks may differ in the distances between nodes, their interconnectivity, their transmission rates and their signal types, yet still have identical topologies.

The Technical Connotation of Topology The pattern or layout of interconnections between the elements or nodes of a computer network is its network topology, which may be logical or physical. Logical topology describes how data is transferred in a network (the basic network), whereas physical topology (the core network) describes the physical structure of the network: the devices, the cable installations and their locations. A LAN (local area network) is an example of a network that has both a logical and a physical topology.

What are the Basic Types of Topology? The study of network topology recognizes seven basic types: point-to-point, bus (point-to-multipoint), ring, star, hybrid, mesh and tree. The classification is founded on the interconnections, logical or physical, between computers. Logical topology is the way a computer in a given network transmits information, not the way the network looks or is wired, and the speeds of the cables used may vary from one network to another. Physical topology, on the other hand, is shaped by several factors: troubleshooting technique, installation cost, office layout and cable types. A physical topology is chosen on the basis of the network's required media and device access, the desired fault tolerance and the cost of telecommunications circuits. Classified by physical span, networks include Local Area Networks (LAN), Wide Area Internetworks (WAN) and Metropolitan Area (campus or building) internetworks.

How Is the Physical Topology Classified?

Point-to-Point Network Topology This is the basic model of traditional telephony. The simplest topology is a permanent connection between two points. The value of a permanent point-to-point network is proportional to the number of potential pairs of subscribers. It is possible to establish a permanent circuit within many switched telecommunication systems: the telephone in a lobby, for instance, always connects to the same port, no matter what number is dialed. A switched connection saves the cost of a dedicated line between two points, since the resources can be released when they are no longer required.

Bus Network Topology LANs that use bus topology connect each node to a single cable. A connector attaches each computer or server to the bus cable, and a terminator at each end of the cable prevents the signal from bouncing. The source transmits a signal that travels in both directions, passing every machine until it reaches the intended recipient, the system whose address matches. If the address does not match, the data is ignored.
The installation of a single cable makes bus topology an inexpensive solution compared with other topologies; however, the maintenance cost is high, and if the cable breaks, the whole network goes down.

Linear Bus: A bus is linear if all network nodes are connected to a common transmission medium that has exactly two endpoints. Data transmitted between nodes travels over the common medium and is received by all nodes simultaneously.

Distributed Bus: A bus is distributed if all network nodes are connected to a common transmission medium that has more than two endpoints, created by branching the main section of the medium.

Star Network Topology A LAN in which each network host is connected to a central hub has a star topology. Each node is connected to the hub by a point-to-point connection, and all traffic passes through the hub, which serves as a repeater or signal booster. Star topology is the easiest to install and is praised for the simplicity of adding more nodes, but criticized for making the hub a single point of failure. The network may be BMA (broadcast multi-access) or NBMA (non-broadcast multi-access), depending on whether the hub propagates a signal automatically to all spokes or only to those spokes that are addressed.

- Extended Star: A network that places one or more repeaters between the central hub and the peripheral (spoke) nodes, extending the reach beyond what the transmitter power of the hub and the physical-layer standard of the network would otherwise support.
- Distributed Star: A topology based on linear connectivity, with the nodes daisy-chained and no top-level or central connection point.

Ring Network Topology This physical arrangement places the nodes in a circle, with data traveling in one direction; each device serves as a repeater for its neighbor, strengthening the signal as it moves along.

Mesh Network Topology The value of a fully meshed network grows rapidly with the number of subscribers, while the number of direct links grows with the square of the number of nodes (see the link-count sketch at the end of this section).

- Fully Connected: Too complex and costly for practical networks of any size, but highly suitable for a small number of interconnected nodes.
- Partially Connected: Some nodes are connected to more than one other node by point-to-point links. This captures much of the redundancy of a full mesh without the complexity and expense of connecting every pair of nodes.

Tree Network Topology At the top level of the hierarchy, a central root node is connected by point-to-point links to nodes one level below it; those second-level nodes are in turn connected by point-to-point links to nodes at the third level, and so on. The central root is the only node with no node above it, and the hierarchy is symmetrical. The branching factor is the fixed number of nodes connected at the next level down, and such a network must have at least three levels. A physical linear tree topology is one whose branching factor is one.

Knowledge of network topologies is of core importance in computer network design: it is what lets a designer decide which topology best suits a network's requirements.
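The following sketch (illustrative Python; the node count is an arbitrary choice) compares the number of point-to-point links each physical topology needs for the same set of nodes, a quick proxy for wiring cost:

```python
def links(topology: str, n: int) -> int:
    """Number of point-to-point links needed to connect n nodes."""
    if topology == "bus":
        return 1                   # one shared backbone cable
    if topology == "star":
        return n - 1               # every node wired to the central hub
    if topology == "ring":
        return n                   # each node to its neighbor, closed loop
    if topology == "full mesh":
        return n * (n - 1) // 2    # every pair directly connected
    raise ValueError(f"unknown topology: {topology}")

for topo in ("bus", "star", "ring", "full mesh"):
    print(f"{topo:9s} n=10 -> {links(topo, 10):3d} links")
```

For ten nodes a full mesh already needs 45 links, and that quadratic growth is why full meshes are reserved for small networks while partial meshes trade away some redundancy for far fewer links.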
Quantum technologies are the way of the future, but will that future ever arrive? Maybe so. Physicists have cleared a bit more of the path to a plausible quantum future by constructing an elementary network for exchanging and storing quantum information. The network features two all-purpose nodes that can send, receive and store quantum information, linked by a fiber-optic cable that carries it from one node to another on a single photon. The network is only a prototype, but if it can be refined and scaled up, it could form the basis of communication channels for relaying quantum information. A group from the Max Planck Institute of Quantum Optics (M.P.Q.) in Garching, Germany, described the advance in the April 12 issue of Nature. (Scientific American is part of Nature Publishing Group.) Quantum bits, or qubits, are at the heart of quantum information technologies. An ordinary, classical bit in everyday electronics can store one of two values: a 0 or a 1. But thanks to the indeterminacy inherent to quantum mechanics, a qubit can be in a so-called superposition, hovering undecided between 0 and 1, which adds a layer of complexity to the information it carries. Quantum computers would boast capabilities beyond the reach of even the most powerful classical supercomputers, and cryptography protocols based on the exchange of qubits would be more secure than traditional encryption methods. Physicists have used all manner of quantum objects to store qubits—electrons, atomic nuclei, photons and so on. In the new demonstration, the qubit at each node of the network is stored in the internal quantum state of a single rubidium atom trapped in a reflective optical cavity. The atom can then transmit its stored information via an optical fiber by emitting a single photon, whose polarization state carries the mark of its parent atom's quantum state; conversely, the atom can absorb a photon from the fiber and take on the quantum state imprinted on that photon's polarization. Because each node can perform a variety of functions—sending, receiving or storing quantum information—a network based on atoms in optical cavities could be scaled up simply by connecting more all-purpose nodes. "We try to build a system where the network node is universal," says M.P.Q. physicist Stephan Ritter, one of the study's authors. "It's not only capable of sending or receiving—ideally, it would do all of the things you could imagine." The individual pieces of such a system had been demonstrated—atoms sending quantum information on single emitted photons, say—but now the technologies are sufficiently advanced that they can work as an ensemble. "This has now all come together and enabled us to realize this elementary version of a quantum network," Ritter says. Physicists proposed using optical cavities for quantum networks 15 years ago, because they marry the best features of atomic qubits and photonic qubits—namely that atoms stay put, making them an ideal storage medium, whereas photons are speedy, making them an ideal message carrier between stationary nodes. But getting the photons and atoms to communicate with one another has been a challenge. "If you want to use single atoms and single photons, as we do, they hardly interact," Ritter adds. That is where the optical cavity comes in. The mirrors of the cavity reflect a photon past the rubidium atom tens of thousands of times, boosting the chances of an interaction. 
"During this time, there's enough time to really do this information exchange in a reliable way," Ritter says. "The cavity enhances the coupling between the light field and the atom." The M.P.Q. group put their prototype network through a series of tests—transferring a qubit from a single photon to a single atom and reversing the process to transfer information from an atom onto a photon. Combining those read/write operations, the physicists managed to transmit a qubit from one rubidium atom to another located in a separate laboratory 21 meters away, using a messenger photon as the carrier between nodes. (The actual length of optical fiber connecting the two nodes is 60 meters, because it snakes along an indirect route.) A significant number of the photons get lost along the way, limiting the efficiency of the process. But in principle, optical fibers could connect nodes at greater distances. "We're absolutely not limited to these 21 meters," Ritter says. "This 21 meters is just the distance that we happened to have between the two labs." The researchers also demonstrated that their photonic link can be used to entangle the two distant atoms. Quantum entanglement is a phenomenon by which two particles share correlated properties—in other words, the quantum state of one particle depends on the state of its entangled partner. Manipulating one of the particles, then, affects the other particle's state, even if it is located in another laboratory. Researchers hope that entanglement can be harnessed to circumvent the photon losses that come from passage through optical fibers. In a proposed application called a quantum repeater, a series of nodes, linked by entanglement, would extend the quantum connection down the line without depending on any one photon as the carrier. Ritter acknowledges that the new work is simply a prototype, and one for which numerous improvements are possible. For instance, the transfer of a quantum state between labs succeeded only 0.2 percent of the time, owing to various inefficiencies and technical limitations. "Everything is at the edge of what can be done," he says. "All these characteristics are good enough to do what we've done, but there are clear strategies to pursue to make them even better."
<urn:uuid:de9044c9-55a1-4a26-a520-1ff98872c910>
CC-MAIN-2014-49
http://www.scientificamerican.com/article/universal-quantum-network/
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931009968.66/warc/CC-MAIN-20141125155649-00069-ip-10-235-23-156.ec2.internal.warc.gz
en
0.942877
1,158
3.796875
4
Jan. 18, 2001 — Physicists say they can effectively catch a light pulse in a bottle, hold onto it and release it, in an operation described as slowing light to a dead stop. It’s actually the information about the light wave that’s being captured, the researchers say, and such techniques could be applied to a future generation of quantum computers and ultrasecure communication devices. Light normally moves through a vacuum at about 186,000 miles per second. Nothing in the universe moves faster, and Albert Einstein theorized that nothing ever could. However, light waves can slow down as they pass through a medium. Last year, a research team at the Rowland Institute for Science and Harvard University, headed by Danish physicist Lene Hau, brought light waves down to a 1 mph crawl by putting them through a specially prepared haze of ultracold sodium atoms. Now the same group and another team at the Harvard-Smithsonian Center for Astrophysics, led by Ronald Walsworth and Mikhail Lukin, say they have “stored” pulses of light in separate experiments. The Harvard-Smithsonian results are being published in the Jan. 29 issue of Physical Review Letters. The Rowland-Harvard findings will appear in the Jan. 25 issue of Nature, which is not yet publicly available. However, the Nature research was released to the media on Thursday, due to the reports about the other study. Both teams accomplished what sounds like an impossible task: slowing down a light pulse so much that it appears to fade and stop, then starting it up again on demand. However, Lukin and a colleague, David Phillips, told MSNBC.com that the process is less crazy and more complicated than it sounds. The experiments don’t involve stopping the actual photons, or particles of light. Instead, information about the light wave is gradually transferred to specially prepared atoms trapped within a glass chamber, and then turned back into a replica of the original light wave. That’s the real trick. The Harvard-Smithsonian team used warm atoms of rubidium gas, while the Rowland-Harvard researchers used the chilled haze of sodium atoms that worked so well in their previous experiments. In each experiment, one laser beam excites the atoms in such a way that they can’t absorb light in a traditional sense, a process called electromagnetically induced transparency. Then another laser emits a pulse of light toward the chamber. When the pulse enters the chamber, the photons and the excited atoms are coupled into quantum systems called polaritons. During this coupling, the properties of the photons are transferred to the atoms, changing or “flipping” their magnetic spin. In a sense, the atoms weigh down the photons, and that drags down the speed of the pulse. When the pulse is fully within the haze of excited atoms, the intensity of the first laser beam is reduced to zero. “As we decelerate, the pulse has less and less photons, and at the same time there are more and more excited spins. So when we make the light go infinitely slow ... there are no photons remaining, all of the information is in the spins,” Lukin said. He stressed that the photons are not absorbed, as they would be under normal conditions. “The photon disappears, but when one photon disappears, one spin flips,” he said.
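A back-of-the-envelope calculation shows how dramatic the slowdown is. The pulse length here is an assumed illustrative number; the speeds come from the article (vacuum light speed, and the earlier 1 mph slow-light result).

```python
# Rough look at the slowdown described above. Slow light compresses a
# pulse spatially by the ratio of the slow speed to the vacuum speed,
# because the leading edge slows down while the tail is still racing in.

c = 3.0e8                      # speed of light in vacuum, m/s
v_slow = 0.447                 # 1 mph expressed in m/s
pulse_length_vacuum = 1000.0   # assumed 1 km-long pulse entering the cell, m

compression = v_slow / c
print(f"compressed length: {pulse_length_vacuum * compression * 1e6:.1f} micrometers")
# A kilometer-long pulse squeezes to about 1.5 micrometers, small enough
# to fit entirely inside the atom cloud before the control beam is cut.
```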
Hau’s team said the pulse was “frozen” — essentially stored as a quantum pattern imprinted upon the atoms. In each experiment, the information about the light pulse can be stored for about a thousandth of a second before it starts to decay. When the control laser beam is turned back on, photons are once again introduced into the system. The light pulse starts speeding up again, reaching its original velocity by the time it leaves the chamber. “Essentially what we get is an exact replica, in the ideal case,” Lukin said. Both groups said their findings could be applied to a weird technological frontier known as quantum computing. “What’s a big deal is that you really stored this information, and that might have implications for quantum computation and quantum communication,” Lukin said. In fact, the real value of the technique could come from “turning it inside out,” Phillips said. “Instead of starting with a light pulse, you might imagine starting with atoms in a quantum state, and extracting the light pulse to another set of atoms, perhaps only a few inches away or maybe a thousand miles away,” he said. “Hence we will write the quantum state from the original atoms to this new set of atoms, and transmit the quantum information.” Scientists say quantum computing could solve mathematical problems beyond the capability of existing computers, particularly involving code-making and code-breaking. And since quantum information is extremely sensitive to eavesdropping, quantum-based communication systems could provide a new level of data security. One of the pioneers in quantum computing, IBM researcher Charles Bennett, said the efforts to slow down light represented an exciting field of research. The key, he said, is to keep the information in a quantum system free from decay, known in quantum circles as decoherence. “If you could stop (a light pulse) and also stop the loss or at least reduce the loss of coherence, then that would be good,” he said. But Bennett said he could not yet judge how these particular light-stopping experiments would affect his field. In a commentary written for Nature, University of Colorado physicist Eric Cornell compared the experiments to a grand trick in which the magician makes a speeding train suddenly disappear into a sheet of gossamer fabric — and then, seconds later, just as suddenly roar out the other side. Cornell said it wasn’t yet clear whether the experiments would truly have technological relevance to the quest for quantum computing. “But for now it hardly matters — trainspotting doesn’t get any more interesting than this,” he said.
<urn:uuid:eb24bf4b-9ea8-4c2e-8967-3cdd26bdccb0>
CC-MAIN-2014-49
http://www.today.com/id/3077366/ns/technology_and_science-science/
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400378862.11/warc/CC-MAIN-20141119123258-00047-ip-10-235-23-156.ec2.internal.warc.gz
en
0.937777
1,397
3.703125
4
When in 1935 physicist Erwin Schrödinger proposed his thought experiment involving a cat that could be both dead and alive, he could have been talking about D-Wave Systems. The Canadian start-up is the maker of what it claims is the world’s first commercial-scale quantum computer. But exactly what its computer does and how well it does it remain as frustratingly unknown as the health of Schrödinger’s poor puss. D-Wave has succeeded in attracting big-name customers such as Google and Lockheed Martin Corp. But many scientists still doubt the long-term viability of D-Wave’s technology, which has defied scientific understanding of quantum computing from the start. D-Wave has spent the last year trying to solidify its claims and convince the doubters. “We have the world’s first programmable quantum computer, and we have third-party results to prove it computes,” says Vern Brownell, CEO of D-Wave. But some leading experts remain skeptical about whether the D-Wave computer architecture really does quantum computation and whether its particular method gives faster solutions to difficult problems than classical computing can. Unlike ordinary computing bits that exist as either a 1 or a 0, the quantum physics rule known as superposition allows quantum bits (qubits) to exist as both 1 and 0 at the same time. That means quantum computing could effectively perform a huge number of calculations in parallel, allowing it to solve problems in machine learning or figure out financial trading strategies much faster than classical computing could. With that goal in mind, D-Wave has built specialized quantum-computing machines of up to 512 qubits, the latest being a D-Wave Two computer purchased by Google for installation at NASA’s Ames Research Center in Moffett Field, Calif. D-Wave has gained some support from independent scientific studies that show its machines use both superposition and entanglement. The latter phenomenon allows several qubits to share the same quantum state, connecting them even across great distances. But the company has remained mired in controversy by ignoring the problem of decoherence—the loss of a qubit’s quantum state, which causes errors in quantum computing. “They conjecture you don’t need much coherence to get good performance,” says John Martinis, a professor of physics at the University of California, Santa Barbara. “All the rest of the scientific community thinks you need to start with coherence in the qubits and then scale up.” Most academic labs have painstakingly built quantum-computing systems—based on a traditional logic-gate model—with just a few qubits at a time in order to focus on improving coherence. But D-Wave ditched the logic-gate model in favor of a different method called quantum annealing, also known as adiabatic quantum computing. Quantum annealing aims to solve optimization problems that resemble landscapes of peaks and valleys, with the lowest valley representing the optimum, or lowest-energy, answer. Classical computing algorithms tackle optimization problems by acting like a bouncing ball that randomly jumps over nearby peaks to reach the lower valleys—a process that can end up with the ball getting trapped when the peaks are too high. Quantum annealing takes a different and much stranger approach. The quantum property of superposition essentially lets the ball be everywhere at once at the start of the operation. The ball then concentrates in the lower valleys, and finally it can aim for the lowest valleys by tunneling through barriers to reach them. 
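The bouncing-ball picture is easy to make concrete. The sketch below runs plain simulated annealing (the classical counterpart of the process, not a model of D-Wave's hardware) on a toy two-valley landscape; the energy function and cooling schedule are invented for illustration.

```python
import math, random

# Minimal classical "bouncing ball" search on a two-valley landscape, in
# the spirit of the optimization picture above. This is ordinary simulated
# annealing, a classical stand-in rather than quantum annealing: a quantum
# annealer would instead tunnel through the central barrier.

def energy(x):
    # Double well: valleys near x = -1 and x = +1; the left one is lower.
    return (x * x - 1.0) ** 2 + 0.3 * x

random.seed(1)
x = 1.0                      # start trapped in the higher (right) valley
temperature = 1.0
for step in range(20_000):
    candidate = x + random.gauss(0.0, 0.1)        # small random "bounce"
    dE = energy(candidate) - energy(x)
    # Accept downhill moves always; uphill moves only with thermal luck.
    if dE < 0 or random.random() < math.exp(-dE / temperature):
        x = candidate
    temperature = max(0.01, temperature * 0.999)  # cool down slowly

print(f"final x = {x:.2f}, energy = {energy(x):.3f}")
# If cooling is too fast the walker can stay stuck near x = +1,
# exactly the trapping problem that quantum tunneling aims to avoid.
```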
That means D-Wave’s machines should perform best when their quantum-annealing system has to tunnel only through hilly landscapes with thin barriers, rather than those with thick barriers, Martinis says. Independent studies have found suggestive, though not conclusive, evidence that D-Wave machines do perform quantum annealing. One such study—with Martinis among the coauthors—appeared in the arXiv e-print service this past April. Another study by a University of Southern California team appeared in June in Nature Communications. But the research also shows that D-Wave’s machines still have yet to outperform the best classical computing algorithms—even on problems ideally suited for quantum annealing. “At this point we don’t yet have evidence of speedup compared to the best possible classical alternatives,” says Daniel Lidar, scientific director of the Lockheed Martin Quantum Computing Center at USC, in Los Angeles. (The USC center houses a D-Wave machine owned by Lockheed Martin.) What’s more, D-Wave’s machines have not yet demonstrated that they can perform significantly better than classical computing algorithms as problems become bigger. Lidar says D-Wave’s machines might eventually reach that point—as long as D-Wave takes the problem of decoherence and error correction more seriously. The growing number of independent researchers studying D-Wave’s machines marks a change from past years when most interactions consisted of verbal mudslinging between D-Wave and its critics. But there’s still some mud flying about, as seen in the debate over a May 2013 paper [PDF] that detailed the performance tests used by Google in deciding to buy the latest D-Wave computer. Catherine McGeoch, a computer scientist at Amherst College, in Massachusetts, was hired as a consultant by D-Wave to help set up performance tests on the 512-qubit machine for an unknown client in September 2012. That client later turned out to be a consortium of Google, NASA, and the Universities Space Research Association. Media reports focused on the fact that D-Wave’s machine had performed 3600 times as fast as commercial software by IBM. But such reporting overlooked McGeoch’s own warnings that the tests had shown only how D-Wave’s special-purpose machine could beat general-purpose software. The tests had not pitted D-Wave’s machines against the best specialized classical computing algorithms. “I tried to point out the impermanency of that [3600x] number in the paper, and I tried to mention it to every reporter that contacted me, but apparently not forcefully enough,” McGeoch says. Indeed, new classical computing algorithms later beat the D-Wave machine’s performance on the same benchmark tests, bolstering critics’ arguments. “We’re talking about solving the one problem that the D-Wave machine is optimized for solving, and even for that problem, a laptop can do it faster if you run the right algorithm on it,” says Scott Aaronson, a theoretical computer scientist at MIT. Aaronson worries that overblown expectations surrounding D-Wave’s machines could fatally damage the reputation of quantum computing if the company fails. Still, he and other researchers say D-Wave deserves praise for the engineering it has done. The debate continues to evolve as more independent researchers study D-Wave’s machines. Lockheed Martin has been particularly generous in making its machine available to researchers, says Matthias Troyer, a computational physicist at ETH Zurich. 
(Troyer presented preliminary results at the 2013 Microsoft Research Faculty Summit suggesting that D-Wave’s 512-qubit machine still falls short of the best classical computing algorithms.) Google’s coalition also plans to let academic researchers use its D-Wave machine. “The change we have seen in the past years is that by having access to the machines that Lockheed Martin leased from D-Wave, we can engage with the scientists and engineers at D-Wave on a scientific level,” Troyer says. About the Author Brooklyn, N.Y.–based reporter Jeremy Hsu knew the time was right for a story about the Canadian quantum-computer company D-Wave Systems and its controversial claims. “There are finally independent studies that go at these big questions that have been hanging over this company from the start,” he says. “It was time to check in with the quantum-computing community to see if their attitude had changed.” The answer? It’s complicated.
<urn:uuid:788d5193-e5fb-4e44-9d8e-fadf975d8355>
CC-MAIN-2014-49
http://spectrum.ieee.org/computing/hardware/dwaves-year-of-computing-dangerously
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400376197.4/warc/CC-MAIN-20141119123256-00116-ip-10-235-23-156.ec2.internal.warc.gz
en
0.945716
1,697
3.609375
4
If the experiment was meant to silence the critics, it didn’t. Four years ago, an upstart tech company created a stir when it claimed to have built a quantum computer—a thing that, in principle, could solve problems ordinary computers can’t. Physicists from D-Wave Systems in Burnaby, Canada, even put on a demonstration. But other researchers questioned whether there was anything quantum mechanical going on inside the device. Now, the D-Wave team has published data that they say prove quantum phenomena are at work within its chip. But even if that’s true, others still doubt that, as D-Wave researchers claim, the chip can do quantum-mechanical computations. “I think they’re overstating this,” says John Martinis, a physicist at the University of California, Santa Barbara (UCSB). “It’s not obvious that they’ve implemented a quantum algorithm.” Physicists have been trying to develop quantum computers for more than a decade. An ordinary computer deals with bits that encode a 0 or a 1. As first conceived, a quantum computer would use subatomic particles or other quantum objects as “qubits” that could encode 0, 1, or, thanks to the weird rules of quantum mechanics, both 0 and 1 at the same time. What's more, a string of qubits in that strange state could encode every possible combination of 1 and 0 values at the same time. As a result, a quantum computer could process myriad inputs at once and crack problems that would overwhelm a conventional computer. However, that approach to quantum computing, called the “gate model,” presents many unresolved practical problems, as scientists must maintain and manipulate the delicate quantum state of many qubits. D-Wave researchers have taken a different tack, known as “adiabatic quantum computing” or “quantum annealing.” They begin with a set of noninteracting qubits—in their rig, little rings of superconductor that can carry current either one way or the other or both ways at once—and put the rings in their lowest energy “ground state.” To perform the computation, the researchers slowly turn on various interactions among the qubits. If they’ve done things right, then the ground state of the noninteracting system should naturally evolve into the ground state of the interacting system and reveal the answer to the problem encoded in the interactions. In February 2007, D-Wave created a splash when its 16-qubit machine solved several puzzles—although none that a conventional computer couldn’t handle—such as figuring out how to seat guests around a table so that people who dislike each other do not end up side by side. However, “people had serious doubts that this was a true quantum computer,” says Wim van Dam, a theoretical computer scientist at UCSB. Here’s why: The workings of the machine can be thought of as tracing the trajectory of a marble through a changing energy landscape of peaks and valleys as it finds its way to the lowest point—the solution to the problem. A process called quantum tunneling lets the marble burrow from one valley to another. At the same time, however, plain old “thermal fluctuations” also agitate the hypothetical marble and can push it over the ridges in the landscape so that it reaches the lowest valley. That process is not quantum mechanical, van Dam says, so if that’s how the D-Wave computer works, then it cannot be significantly more efficient than an ordinary computer. However, new data show that in fact the qubits in the chip can find their lowest energy state quantum mechanically, D-Wave researchers report this week in Nature.
Physicist Mark Johnson and colleagues begin by experimenting with a single qubit within their latest 128-qubit chip. Current in the ring can circulate either clockwise or counterclockwise, and those two states represent two dips in a very simple energy landscape. By tuning the qubit and applying a magnetic field, the researchers can raise the height of the ridge between those two states and also tilt the entire landscape to make one dip lower than the other. They can also change the temperature—the source of the pesky thermal fluctuations. The researchers found that the ability of the qubit to get from the higher energy state to the lower one at first decreases as the temperature falls. But below about 45 thousandths of a degree above absolute zero (45 millikelvin), the rate at which the qubit makes the switch levels off. That suggests that even as thermal fluctuations grow too weak to nudge the system over the energy barrier, quantum tunneling remains to allow the qubit through it. The researchers observe a similar phenomenon as a chain of eight qubits with very simple interactions finds its way to its predicted ground state. “The evolution [of the system] is consistent with quantum mechanics and not with classical mechanics,” Johnson says. Martinis has some quibbles. Still, he says, “I think it’s pretty likely that they’ve got tunneling. I’m not 100% sure, but I’m 90% sure.” The results won’t end the controversy over D-Wave's technology, however. Quantum tunneling alone is not enough to make the device significantly faster than a classical computer, van Dam says. To whack through really big computations that would take an infinite amount of time on a classical computer, he says, D-Wave’s chip also has to maintain a kind of delicate synchrony between the individual qubits called coherence. But it’s possible that D-Wave’s qubits lose coherence so quickly that they act more or less independently, yet nonetheless tunnel to their collective ground state. And in that case, the computer can’t hope to be any more efficient than a regular one, van Dam says. Johnson and the D-Wave team are not convinced that coherence is necessary in adiabatic quantum computing. “I think it’s not entirely understood what role coherence plays in quantum annealing,” Johnson says. Martinis says it’s unusual to see a company essentially wager its future on a point of scientific dispute. “In some ways, I kind of respect that it’s a clear corporate strategy,” he says. “On the other hand, I’m not going to invest in their technology because I think they’re wrong.” Stay tuned. Johnson says the D-Wave team members will have more publications to back up their claim that they really have a quantum computer.
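The leveling-off behavior described above can be captured with a back-of-the-envelope model: a thermally activated escape rate that dies off exponentially as the chip cools, plus a temperature-independent tunneling floor. All parameters below are assumed, illustrative values, not D-Wave's measured numbers.

```python
import numpy as np

# Sketch of the escape-rate picture reported above: thermal activation
# falls off exponentially with temperature, while quantum tunneling
# contributes a constant floor. Units and parameters are illustrative.

k_B = 1.0                     # work in units where k_B = 1
barrier = 0.5                 # assumed barrier height (arbitrary units)
attempt_rate = 1.0            # assumed thermal attempt frequency
tunnel_rate = attempt_rate * np.exp(-barrier / 0.045)  # floor matched near "45 mK"

for T in [0.300, 0.150, 0.075, 0.045, 0.020, 0.010]:
    thermal = attempt_rate * np.exp(-barrier / (k_B * T))
    total = thermal + tunnel_rate
    print(f"T = {T:.3f}: thermal {thermal:.2e}, total {total:.2e}")
# Above ~0.045 the thermal term dominates and the total rate drops as T
# falls; below it the constant tunneling term takes over and the rate
# levels off, which is the signature the D-Wave team reported.
```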
<urn:uuid:397e301f-6b3f-4b68-9daa-4ed5ab7275f0>
CC-MAIN-2014-49
http://news.sciencemag.org/physics/2011/05/controversial-computer-least-little-quantum-mechanical?mobile_switch=mobile
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400379404.25/warc/CC-MAIN-20141119123259-00226-ip-10-235-23-156.ec2.internal.warc.gz
en
0.951651
1,396
3.640625
4
Posted: Mar 27, 2013 Physicists' technique for cooling molecules may be a stepping stone to quantum computing (Nanowerk News) The next generation of computers promises far greater power and faster processing speeds than today's silicon-based machines. These "quantum computers" — so called because they would harness the unique quantum mechanical properties of atomic particles — could draw their computing power from a collection of super-cooled molecules. But chilling molecules to a fraction of a degree above absolute zero, the temperature at which they can be manipulated to store and transmit data, has proven to be a difficult challenge for scientists. "Scientists have been trying to cool molecules for a decade and have succeeded with only a few special molecules," said Eric Hudson, a UCLA assistant professor of physics and the paper's senior author. "Our technique is a completely different approach to the problem — it is a lot easier to implement than the other techniques and should work with hundreds of different molecules." Previous attempts to create ultracold molecules were only effective with one or two specific kinds. Creating a method that can be used with many different molecules would be a major step forward because it is difficult to say which materials might be used in quantum computers or other future applications, Hudson said. By immersing charged barium chloride molecules in an ultracold cloud of calcium atoms, Hudson and his colleagues are able to prevent most of the molecules from vibrating and rotating. Halting the molecules is a necessary hurdle to overcome before they can be used to store information like a traditional computer does. "The goal is to build a computer that doesn't work with zeros and ones, but with quantum mechanical objects," Hudson said. "A quantum computer could crack any code created by a classical computer and transmit information perfectly securely." Hudson's experiment makes molecules extremely cold under highly controlled conditions to reveal the quantum mechanical properties that are hidden under normal circumstances. At room temperature, molecules rocket around, bouncing into each other and exchanging energy. Any information a scientist attempted to store in such a chaotic system would quickly become gibberish. "We isolate these molecular systems in a vacuum, effectively levitating them in the middle of nothing," Hudson said. "This removes them from the rest of the world that wants to make them classical." The quantum mechanical world of subatomic particles deviates from the classical world that we observe with the naked eye because according to quantum mechanics, electrons can only exist at specific energy levels. In a quantum computer made of a collection of single atoms, information might be stored by boosting some atomic electrons to higher energy levels while leaving others at lower energy states. However, these atomic energy states are not stable enough to reliably preserve data, Hudson said. "One of the challenges with atoms is that their energy states are very easily influenced by the outside world," Hudson said. "You make this beautiful quantum state, but then the outside world tries to destroy that information." Instead of saving data in easily disrupted atomic energy states, a more robust way to store information is in the rotational energy states of molecules, Hudson said.
A spinning molecule in the lowest energy rotational state could represent a binary one, while a stationary molecule could represent a binary zero. Despite applications for quantum computing and other industries, cooling molecules to extremely low temperatures has proved a challenge. Even the simplest molecule composed of only two atoms is a far more complex system than a single atom. Each molecule vibrates and rotates like a miniature whirling slinky, and all of that movement must be stilled so that the molecule can lose energy and cool down. A new cooling technique To solve the ultracold molecule conundrum, Hudson and his group first created a floating cloud of calcium atoms corralled by incoming laser beams from all directions. This magneto-optical trap keeps the atoms stationary as it cools them to nearly absolute zero. They then use specialized rods with high, oscillating voltages as part of an ion trap to confine a cloud of positively-charged barium chloride molecules within the ultracold ball of calcium atoms to complete the cooling process. For the vibrating, energetic molecules to lose heat, they must spend a significant amount of time in contact with the surrounding ultracold atom cloud. Hudson and his colleagues used barium chloride ions, molecules missing one electron, because charged molecules are easier to trap and cool than their neutral counterparts. The use of molecular ions is an essential innovation because previous efforts have demonstrated that neutral molecules ricochet off ultracold atoms without sufficient heat transfer. "When a molecular ion and a neutral atom get close together they get in tight and bang off each other a bunch before the ion goes away," Hudson said. "When they collide like that it is very easy for the energy in one to go to the other." While magneto-optical and ion traps are not new to the world of molecular physics, Hudson and his colleagues became the first group to combine these methods to create a cloud of ultracold molecules. This paper is the result of over four years of work spent designing, building, and testing their experiment. "These two different technologies earned Nobel prizes for the scientists who developed them, but there wasn't really a body of knowledge about how to put these two procedures together," Hudson said.
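The collision-by-collision energy hand-off described above can be sketched numerically. The transfer fraction and temperatures below are assumed, illustrative numbers, not the UCLA group's measured values; the point is only the geometric cooling toward the bath temperature.

```python
import random

# Toy picture of sympathetic cooling: a hot trapped ion repeatedly
# collides with atoms from an ultracold cloud, handing over a fraction
# of the energy difference each time.

random.seed(0)
T_ion = 300.0        # effective "temperature" of the molecular ion (assumed)
T_atoms = 0.001      # ultracold calcium cloud, held cold by its own laser cooling
transfer = 0.1       # assumed average fraction of the gap exchanged per collision

collisions = 0
while T_ion > 0.01:
    # Each close ion-atom encounter moves the ion toward the bath temperature.
    kick = transfer * random.uniform(0.5, 1.5)    # collision-to-collision spread
    T_ion += kick * (T_atoms - T_ion)
    collisions += 1

print(f"reached T = {T_ion:.3f} after {collisions} collisions")
# Because the charged molecule and neutral atom interact strongly at close
# range ("bang off each other a bunch"), relatively few collisions suffice
# compared with neutral-neutral encounters.
```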
<urn:uuid:1d20abbc-3944-44a5-9138-a8ebdf0cd6f1>
CC-MAIN-2014-49
http://www.nanowerk.com/news2/newsid=29755.php
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416931007510.17/warc/CC-MAIN-20141125155647-00128-ip-10-235-23-156.ec2.internal.warc.gz
en
0.938608
1,137
4
4
You may have a $10,000 Sub-Zero fridge in your kitchen, but this is cooler. Theoretical physicists have dreamed up a scheme to make a refrigerator out of a pair of quantum particles such as ions or atoms, or even a single particle. The fridges may be the smallest ones possible. “It’s very elegant and innovative,” says Nicolas Gisin, a theorist at the University of Geneva in Switzerland. Theo Nieuwenhuizen, a theorist at the University of Amsterdam, says “I don’t see any error, so probably this would work.” The challenge is to make a few quantum particles act like a so-called thermal machine, the theory of which was set out by French engineer Sadi Carnot in 1824. Carnot imagined a piston filled with gas that could be compressed or expanded. The piston could make contact with either of two large bodies (say, massive steel blocks) at different temperatures, which could serve as the “hot bath” and the “cold bath.” Carnot put the imaginary piston through a cycle of motions, including one in which the gas expands while in contact with the hot bath and another in which it is compressed while in contact with the cold bath. During the cycle, the piston does work while absorbing heat from the hot bath and releasing heat into the cold one, making it a “heat engine.” Reverse the cycle and, in response to work done on it, the piston acts as a refrigerator, absorbing heat from the cold bath and releasing it into the hot one. Now, Noah Linden, Sandu Popescu, and Paul Skrzypczyk of the University of Bristol in the United Kingdom report that, at least in principle, they can make a refrigerator out of a few quantum particles called “qubits.” Each qubit has only two possible quantum states: a zero-energy ground state and a fixed-energy excited state. The theorists have found a way to siphon energy out of one qubit by making it interact with just two others. The theorists arrange things so that each qubit has a different excited-state energy but the trio of qubits has two configurations with the same total energy. One is the configuration in which only the first and third qubits are in their excited states—denoted (101). The other is the configuration in which only the second qubit is in its excited state—denoted (010). If all three qubits were at the same temperature, then the system would flip with equal probability back and forth between these two configurations. But the researchers skew that flipping, as they explain in a paper in press at Physical Review Letters. The trick is to put the first two qubits in contact with a cold bath and the third one in contact with a hot bath. The higher temperature makes it more likely that the third qubit will be in its excited state—and thus that the trio will be in the (101) state instead of the (010) state. But that means the system is more likely to flip out of (101) and into (010) than the other way around. So on average the flipping takes the first qubit from its excited state to its ground state and draws energy out of the first qubit. After a flip, the qubits essentially reset by interacting with the baths, allowing the cycle to start again. The theorists measure the fridge’s size in terms of the number of its quantum states, and the three qubits have a total of eight possible states. That number can be clipped to six, if they replace the second and third qubits with a single “qutrit,” a particle with a ground state and two excited states—although those two states have to be in contact with different baths. 
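A quick numerical sanity check of this three-qubit logic is straightforward. The sketch below uses illustrative energies satisfying the degeneracy condition E1 + E3 = E2 and assumed bath temperatures; it reproduces only the qualitative bias the theorists describe, not their actual calculation.

```python
import math

# Toy check of the three-qubit fridge described above. Qubits 1 and 2 sit
# in the cold bath, qubit 3 in the hot bath (temperatures assumed, k_B = 1).

E1, E3 = 1.0, 2.0
E2 = E1 + E3                      # degeneracy condition: E(101) == E(010)
T_cold, T_hot = 0.5, 2.0

def excited_prob(E, T):
    # Two-level thermal occupation probability of the excited state.
    return math.exp(-E / T) / (1.0 + math.exp(-E / T))

p1, p2 = excited_prob(E1, T_cold), excited_prob(E2, T_cold)
p3 = excited_prob(E3, T_hot)

# Probability of finding the trio in each of the two degenerate states:
p_101 = p1 * (1 - p2) * p3
p_010 = (1 - p1) * p2 * (1 - p3)
print(f"P(101) = {p_101:.4f}, P(010) = {p_010:.4f}")
# The hot bath makes (101) more likely than (010), so flips run
# preferentially (101) -> (010), draining energy from qubit 1 on average.
```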
“We believe that’s probably the smallest number of states you can get away with,” Linden says. In theory, such a fridge can get arbitrarily close to absolute zero, and Popescu says that it might be possible to make one using trapped ions for the qubits and streams of laser light as the baths. Some researchers hope to use such qubits as the guts for a quantum computer, and Popescu says the refrigerator scheme might allow researchers to cool some set of qubits with a few others. David Wineland, an experimental physicist with the U.S. National Institute of Standards and Technology in Boulder, Colorado, says he believes such schemes can indeed be implemented in trapped ions. Others suggest that such tiny quantum refrigerators might already be humming along in nature. It’s possible that one part of a biomolecule might work to cool another in such a fashion, says Hans Briegel, a theorist at the University of Innsbruck in Austria. “I don’t expect that you will have a mechanism exactly like this,” Briegel says, “but it gives you a framework valuable for telling what to search for.” No word yet on when physicists might unveil the smallest possible beer.
<urn:uuid:4136625a-7681-4b2e-ba8a-f677d4c22b92>
CC-MAIN-2014-49
http://news.sciencemag.org/2010/08/quantum-physicists-dream-smallest-possible-refrigerator
s3://commoncrawl/crawl-data/CC-MAIN-2014-49/segments/1416400379512.32/warc/CC-MAIN-20141119123259-00193-ip-10-235-23-156.ec2.internal.warc.gz
en
0.952537
1,052
3.5625
4
Microwave photonics circuit elements will need to be similar to their RF analogs to provide the desired functionality. One of these analogous circuit elements is a terahertz microwave cavity resonator, which can be integrated onto an IC with standard CMOS processes. This is one of many circuit elements that can be placed on an IC and used to enable unique applications. Microwave components have a lot more going on than what ends up in your microwave oven. Terahertz wave sources, detectors, and components have yet to be miniaturized, and the terahertz portion of the microwave spectrum is still largely unexplored. So far, the best we can do is get into the high GHz (low THz) region for oscillation, detection, and wave manipulation. This region is critical for many applications, including quantum computing, imaging, sensing, and ultra-fast communication. One fundamental set of components is terahertz microcavity resonators. These components are part of a larger photonics platform and they play analogous roles to RF resonators on a PCB. The simple geometry of these resonators also allows them to be placed on a chip alongside other photonic structures. If you’re a budding photonics engineer, keep reading to learn more about these resonator structures and how they might play a role in current and upcoming photonics systems. What Are Terahertz Microcavity Resonators? Much like any other resonator, terahertz microcavity resonators have a fundamental frequency that lies in the terahertz region. In terms of wavelength, a 1 THz wave in air has a wavelength of only 300 microns, which is quite large compared to today’s transistors. These structures provide the same function as well; they allow a wave matching the fundamental frequency or one of its harmonics to excite a high-Q resonance, whereby a standing wave can form in the cavity. Much like a wave on a string or in a waveguide, this standing wave at one of the eigenfrequencies will have very high intensity due to constructive interference inside the cavity. The very strong, very coherent electromagnetic wave in this structure can then be used for some other application. The challenges in working with these structures are wave generation and detection, both of which need to be solved for terahertz microcavity resonators to be useful at the chip level. Geometry and Eigenfrequencies A simple rectangular terahertz microcavity resonator has a discrete eigenfrequency spectrum, and the eigenfrequencies can be tuned to desired values by adjusting the geometry, just like any other resonator. The equation below applies to a closed rectangular cavity and provides a good first approximation for a slightly lossy cavity (i.e., with high dielectric constant contrast at the edge):

$$f_{mnl} = \frac{c}{2\sqrt{\mu_r \varepsilon_r}} \sqrt{\left(\frac{m}{a}\right)^2 + \left(\frac{n}{b}\right)^2 + \left(\frac{l}{d}\right)^2}$$

Here, c is the speed of light in vacuum, μr and εr are the relative permeability and permittivity of the cavity material, a, b, and d are the cavity dimensions, and the integers m, n, and l index the mode. Although a rectangular geometry is described above, more complex structures may be used for different applications. In a different structure (e.g., circular, hemispherical, or cylindrical) with an open edge, the eigenfrequencies may not obey such a simple equation. Instead, they may be determined from a dispersion relation that is a transcendental equation, which requires a numerical technique to extract specific frequencies. This is a well-known procedure for solving Sturm-Liouville problems in waveguides and resonators.
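As a rough numerical companion to the formula above, the following sketch lists the lowest few modes of a hypothetical closed cavity. The dimensions and permittivity are assumed values chosen to land in the sub-terahertz range, not parameters of any device discussed here.

```python
import itertools, math

# Eigenfrequencies of a closed rectangular cavity, per the formula above.
# Dimensions and material are assumed, silicon-like illustrative values.

c = 2.998e8                        # speed of light in vacuum, m/s
eps_r, mu_r = 11.7, 1.0            # assumed: silicon-like permittivity
a, b, d = 100e-6, 100e-6, 50e-6    # assumed cavity dimensions, m

def f_mnl(m, n, l):
    return (c / (2 * math.sqrt(mu_r * eps_r))) * math.sqrt(
        (m / a) ** 2 + (n / b) ** 2 + (l / d) ** 2)

# Physical cavity modes need at least two nonzero indices.
modes = [idx for idx in itertools.product(range(3), repeat=3)
         if sum(1 for i in idx if i > 0) >= 2]
for m, n, l in sorted(modes, key=lambda idx: f_mnl(*idx))[:5]:
    print(f"mode {m}{n}{l}: {f_mnl(m, n, l) / 1e12:.3f} THz")
# With ~100-micron dimensions the lowest modes land around 0.6-1.1 THz,
# consistent with the 300-micron free-space wavelength scale noted above.
```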
If you have a much more complex structure that can’t be approximated as a simple shape, the various eigenfrequencies and the spatial distribution of the electromagnetic field can be determined using a 3D field solver (FDFD technique). A field solver you would normally use for IC packages can also be used for modeling terahertz microcavity resonators. Applications for terahertz microcavity resonators are still being researched, as are the device architectures required for different applications. Some proposed applications of terahertz microcavity resonators include:
- Sensing and imaging: High-Q terahertz microcavity resonators can be used for highly coherent imaging and sensing, with applications in molecular detection and biological imaging.
- Silicon photonics: While this application area is normally discussed in terms of SMF or MMF wavelengths, devices in this area can also operate at THz frequencies and will need terahertz microcavity resonators to act as filters and amplifiers.
- Communication: Currently, the world record for the highest data rate transmission belongs to an experimental wireless system operating at THz frequencies. Miniaturizing these systems at the chip level will require microcavity structures, including terahertz microcavity resonators.
The important advancement provided by these structures is that they can occur on an integrated circuit. Today, these applications still involve large optical systems where an infrared mode comb in a femtosecond soliton laser is used to generate a terahertz wave through interference. Similarly, large systems are also used for the detection and manipulation of terahertz waves. Terahertz microcavity resonators are one class of components that can provide high-Q or low-Q reception of THz frequencies, which can then be passed to a detector element or other photonic circuit. The range of useful materials for building terahertz microcavity resonators, or for building coupling structures, is also an open research question. Some material platforms used for terahertz microcavity resonators include:
- Silicon: This material is the most promising for the fabrication of terahertz devices and their integration alongside other electronic circuits.
- GaAs, other III-V’s, and II-VI’s: This promising set of photonic materials has already shown interesting results at ~3 THz frequencies, particularly for the generation of laser light. This material platform is promising for photonics in general.
- Photonic crystals: Periodic nanostructures that are fabricated through chemical deposition methods provide a tunable platform for fabricating a range of terahertz devices, including terahertz microcavity resonators.
- Dielectrics: This broad range of materials includes oxides, salts, polymers, and other materials that can support transmission or absorption in various THz frequency ranges.
For integration, the best set of materials should bond to the industry’s current range of semiconductors. Microcavity resonator materials should be chosen to integrate into existing semiconductor materials platforms and manufacturing processes. As your technology and designs push into more advanced spaces in the years to come, more advanced software that can navigate the nuances and challenges of THz components will be necessary. Be sure to prepare adequately as you stay ahead of the frequency curve.
<urn:uuid:f79efbdf-e963-4c80-877f-28d2bc7015b6>
CC-MAIN-2021-43
https://resources.pcb.cadence.com/blog/2020-todays-and-tomorrows-terahertz-microcavity-resonators
s3://commoncrawl/crawl-data/CC-MAIN-2021-43/segments/1634323585696.21/warc/CC-MAIN-20211023130922-20211023160922-00156.warc.gz
en
0.888988
1,485
3.78125
4
First Teleportation Between Distant Atoms For the first time, scientists have successfully teleported information between two separate atoms in unconnected enclosures a meter apart – a significant milestone in the global quest for practical quantum information processing. Teleportation may be nature’s most mysterious form of transport: Quantum information, such as the spin of a particle or the polarization of a photon, is transferred from one place to another, but without traveling through any physical medium. It has previously been achieved between photons over very large distances, between photons and ensembles of atoms, and between two nearby atoms through the intermediary action of a third. None of those, however, provides a feasible means of holding and managing quantum information over long distances. Now a team from the Joint Quantum Institute (JQI) at the University of Maryland (UMD) and the University of Michigan has succeeded in teleporting a quantum state directly from one atom to another over a substantial distance. That capability is necessary for workable quantum information systems because they will require memory storage at both the sending and receiving ends of the transmission. In the Jan. 23 issue of the journal Science, the scientists report that, by using their protocol, atom-to-atom teleported information can be recovered with perfect accuracy about 90% of the time – and that figure can be improved. “Our system has the potential to form the basis for a large-scale ‘quantum repeater’ that can network quantum memories over vast distances,” says group leader Christopher Monroe of JQI and UMD. “Moreover, our methods can be used in conjunction with quantum bit operations to create a key component needed for quantum computation.” A quantum computer could perform certain tasks, such as encryption-related calculations and searches of giant databases, considerably faster than conventional machines. The effort to devise a working model is a matter of intense interest worldwide. Teleportation works because of a remarkable quantum phenomenon called entanglement, which only occurs on the atomic and subatomic scale. Once two objects are put in an entangled state, their properties are inextricably entwined. Although those properties are inherently unknowable until a measurement is made, measuring either one of the objects instantly determines the characteristics of the other, no matter how far apart they are. The JQI team set out to entangle the quantum states of two individual ytterbium ions so that information embodied in the condition of one could be teleported to the other. Each ion was isolated in a separate high-vacuum trap, suspended in an invisible cage of electromagnetic fields and surrounded by metal electrodes. The researchers identified two readily discernible ground (lowest energy) states of the ions that would serve as the alternative “bit” values of an atomic quantum bit, or qubit. Conventional electronic bits (short for binary digits), such as those in a personal computer, are always in one of two states: off or on, 0 or 1, high or low voltage, etc. Quantum bits, however, can be in some combination, called a “superposition,” of both states at the same time, like a coin that is simultaneously heads and tails – until a measurement is made. It is this phenomenon that gives quantum computation its extraordinary power.
At the start of the experimental process, each ion (designated A and B) is initialized in a given ground state. Then ion A is irradiated with a specially tailored microwave burst from one of its cage electrodes, placing the ion in some desired superposition of the two qubit states – in effect “writing” into “memory” the information to be teleported. Immediately thereafter, both ions are excited by a picosecond (one trillionth of a second) laser pulse. The pulse duration is so short that each ion emits only a single photon as it sheds the energy gained by the laser and falls back to one or the other of the two qubit ground states. Depending on which one it falls into, the ion emits one of two kinds of photons of slightly different wavelengths (designated red and blue) that correspond to the two atomic qubit states. It is the relationship between those photons that will eventually provide the telltale signal that entanglement has occurred. Each emitted photon is captured by a lens, routed to a separate strand of fiber-optic cable, and carried to a 50-50 beamsplitter where it is equally probable for the photon to pass straight through the splitter or to be reflected. On either side of the beamsplitter are detectors that can record the arrival of a single photon. Before it reaches the beamsplitter, each photon is in an unknowable superposition of states. After encountering the beamsplitter, however, each takes on specific characteristics. As a result, for each pair of photons, four color combinations are possible – blue-blue, red-red, blue-red and red-blue – as well as one of two polarizations: horizontal or vertical. In nearly all of those variations, the photons either cancel each other out or both end up in the same detector. But there is one – and only one – combination in which both detectors will record a photon at exactly the same time. In that case, however, it is physically impossible to tell which ion produced which photon because it cannot be known whether a photon arriving at a detector passed through the beamsplitter or was reflected by it. Thanks to the peculiar laws of quantum mechanics, that inherent uncertainty projects the ions into an entangled state. That is, each ion is in a superposition of the two possible qubit states. The simultaneous detection of photons at the detectors does not occur often, so the laser stimulus and photon emission process has to be repeated many thousands of times per second. But when a photon appears in each detector, it is an unambiguous signature of entanglement between the ions. When an entangled condition is identified, the scientists immediately take a measurement of ion A. The act of measurement forces it out of superposition and into a definite condition: one of the two qubit states. But because ion A’s state is irreversibly tied to ion B’s, the measurement also forces B into the complementary state. Depending on which state ion A is found in, the researchers now know precisely what kind of microwave pulse to apply to ion B in order to recover the exact information that had been written to ion A by the original microwave burst. Doing so results in the accurate teleportation of the information. What distinguishes this outcome as teleportation is that no information pertaining to the original memory actually passes between ion A and ion B. The information disappears when ion A is measured and reappears when the microwave pulse is applied to Ion B. 
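For readers who want the bookkeeping spelled out, the sketch below runs the standard textbook gate-model teleportation protocol on a three-qubit statevector. It is not a simulation of the heralded photonic scheme described above; it simply makes concrete how measuring A and then applying a conditional correction to B recovers the original state.

```python
import numpy as np

np.random.seed(0)

# Single-qubit gates and CNOT (control = first qubit of the pair).
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

alpha, beta = 0.6, 0.8j            # arbitrary normalized state to teleport
state = np.kron(np.array([alpha, beta]), np.array([1, 0, 0, 0], dtype=complex))
# qubit order: 0 = A (sender's memory), 1 = a (A's partner), 2 = B (receiver)

# 1) entangle a and B into a Bell pair
state = np.kron(I, np.kron(H, I)) @ state
state = np.kron(I, CNOT) @ state
# 2) Bell-measurement basis change on A and a
state = np.kron(CNOT, I) @ state
state = np.kron(H, np.kron(I, I)) @ state

# 3) measure qubits A and a
probs = np.abs(state) ** 2
outcome = np.random.choice(8, p=probs / probs.sum())
m_A, m_a = (outcome >> 2) & 1, (outcome >> 1) & 1

# collapse onto the measured branch and keep B's two amplitudes
mask = [((i >> 2) & 1) == m_A and ((i >> 1) & 1) == m_a for i in range(8)]
bvec = state[mask]
bvec = bvec / np.linalg.norm(bvec)

# 4) classically controlled correction on B (the "microwave pulse" step)
if m_a: bvec = X @ bvec
if m_A: bvec = Z @ bvec
print("teleported state:", bvec, " original:", [alpha, beta])
# No information about (alpha, beta) travels from A to B directly; only
# the two classical measurement bits select which correction to apply.
```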
“One particularly attractive aspect of our method is that it combines the unique advantages of both photons and atoms,” says Monroe. “Photons are ideal for transferring information fast over long distances, whereas atoms offer a valuable medium for long-lived quantum memory. The combination represents an attractive architecture for a ‘quantum repeater,’ that would allow quantum information to be communicated over much larger distances than can be done with just photons. Also, the teleportation of quantum information in this way could form the basis of a new type of quantum internet that could outperform any conventional type of classical network for certain tasks.” The research was supported by the Intelligence Advanced Research Project Activity program under U.S. Army Research Office contract, the National Science Foundation (NSF) Physics at the Information Frontier Program, and the NSF Physics Frontier Center at JQI.
<urn:uuid:9d3bacdb-3c2c-45df-ba4e-0bf7800ca4c5>
CC-MAIN-2021-43
https://jqi.umd.edu/news/first-teleportation-between-distant-atoms
s3://commoncrawl/crawl-data/CC-MAIN-2021-43/segments/1634323587719.64/warc/CC-MAIN-20211025154225-20211025184225-00636.warc.gz
en
0.933143
1,579
3.5625
4
Quantum computers are revolutionizing computers and are paving the way for innovations — for example, in medicine and the Internet of Things. Shohini Ghose explains what sets quantum computers apart. Shohini Ghose’s work begins where our understanding ends. As a physicist, she works in the field of quantum mechanics, which theorizes that there is a probability for a particle to be found in two different locations at a given time — something that seems actually unthinkable. “It is so exciting. We observe those microscopic particles indirectly and develop an explanation of this hidden quantum world,” says Ghose. While this might sound like science fiction, it has concrete real-world applications, especially given that quantum mechanics could revolutionize computers. Ghose explains what this means as follows: “Quantum computers are not just a faster version of our current computers. They operate on the laws of quantum physics. It’s just like a light bulb compared to a candle.” “Quantum computers can do computing tasks that are outside of the reach of even the best computers today.” One? Zero? All in between! Whereas conventional computers use bits as the smallest electronic storage unit, quantum computers use quantum bits — qubits for short. These go beyond the usual binary code of zeros and ones because they can take on any number of overlap states. In other words, it can be described as having a probability of being zero or one. This, which is commonly referred to as a “superposition” state, cannot be compared to anything from our everyday world, but can be easily explained with the following image: imagine a qubit as a sphere with the one at its north pole and the zero at its south pole. While a bit in a conventional computer is in a state of either zero or one, a qubit can take on any in-between state on the surface of the sphere. This superposition allows qubits to carry out parallel computing operations. “That means we can do computing tasks that are outside of the reach of even the best computers today. We can do calculations faster, and search faster through big data,” says Ghose. Artificial intelligence, which is designed to analyze huge amounts of data, could benefit from this, as could materials and pharmaceutical research. “Future large-scale quantum simulation could perhaps lead to treatments for diseases like Alzheimer’s,” suggests Ghose. In order for that to happen, atom structures need to be precisely analyzed, which is already difficult for mainframe computers that are currently employed by researchers. “Quantum is one way to really secure the Internet and the communication in the Internet of Things.” An end to all cyber-attacks? Quantum computers could also render communication more secure in the way information is “teleported”. There’s another term associated with sci-fi films. However, the phenomenon of “entanglement” lies behind quantum mechanics: two qubits are linked together in such a way that a change to one causes a change to its corresponding qubit. This occurs without time lags, over any distance, and of course without any physical connection such as cables or radio waves. Using this idea key codes for data transmission could be generated. The clever thing here is that the quantum state of the qubit changes with every unauthorized access — for example, an attack from a hacker. The communication partners would perceive this as a disturbance in their communication, would thus be warned and could use a new key. 
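The idea that a disturbance warns the communication partners can be illustrated with a toy BB84-style simulation. The details below are the textbook scheme with assumed parameters, purely as bookkeeping, not a model of Ghose's specific protocols or of any deployed QKD product.

```python
import numpy as np

# Toy BB84-style key exchange: an eavesdropper who measures the qubits in
# transit disturbs them, and the resulting error rate reveals her presence.

rng = np.random.default_rng(7)
n = 2000

alice_bits  = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)          # 0 = rectilinear, 1 = diagonal
eve_present = True

if eve_present:
    # Eve measures in random bases and resends what she saw; a wrong basis
    # yields (and forwards) a random result.
    eve_bases = rng.integers(0, 2, n)
    match = eve_bases == alice_bases
    transmitted = np.where(match, alice_bits, rng.integers(0, 2, n))
    transmitted_bases = eve_bases
else:
    transmitted, transmitted_bases = alice_bits, alice_bases

bob_bases = rng.integers(0, 2, n)
match_b = bob_bases == transmitted_bases
bob_bits = np.where(match_b, transmitted, rng.integers(0, 2, n))

# Keep only rounds where Alice's and Bob's bases agree (the sifted key),
# then compare: mismatches reveal the eavesdropper.
sift = alice_bases == bob_bases
errors = np.mean(alice_bits[sift] != bob_bits[sift])
print(f"sifted key length: {sift.sum()}, error rate: {errors:.2%}")
# Without Eve the error rate is ~0%; with Eve it jumps to ~25%, so the
# partners discard the key and start over, exactly the warning above.
```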
“Quantum is one way to really secure the Internet and the communication in the Internet of Things,” says Ghose, who works with her team on encryption protocols of this nature. -273.13° Celsius is the temperature to which quantum computers must be cooled in order for the qubits to operate reliably. Why it is important to talk about quantum computers The immense power of quantum computers also raises ethical questions. On the one hand, they currently consume a great deal of electricity because their chips have to be laboriously cooled down with liquid helium to -273.13° Celsius in what are known as dilution refrigerators. On the other hand, there is a risk that this technology could fall into the wrong hands — should criminals succeed in building a quantum computer, they could use it for the purpose of launching cyber-attacks. They would then be able to crack all data that is encrypted on the basis of conventional computers. Therefore, Ghose is advocating for a social discussion about quantum computers: “I hope that we can address this before the technology is rolled out rather than to catch up and to regulate and control later.” Ghose is convinced that this would allow the enormous potential of the quantum revolution to be put on the right track. An interview with Dr. Shohini Ghose, professor of quantum physics and computer science at Wilfrid Laurier University in Waterloo, Canada: “Quantum offers a way to encrypt information that can never be hacked, no matter how good the hackers are.” Shohini Ghose grew up in India and later studied physics and mathematics at the University of Miami and the University of New Mexico, USA. In 2003 she was a postdoctoral student at the University of Calgary in Canada and one year later became a professor at Wilfrid Laurier University. Together with her colleague Paul Jessen’s team from the University of Arizona, Ghose was the first to show that there is a connection between chaos theory and quantum entanglement in cesium atoms. She is also the founder and director of the Laurier Centre for Women in Science and is the president of the Canadian Association of Physicists. Quantum computing makes use of what are referred to as quantum bits, which make these machines more powerful than conventional computers. Among other things, this enables very secure encryption techniques for data transmission on the Internet, says Shohini Ghose. Nevertheless, she warns that the large computing capacity of quantum computers also raises ethical questions that urgently need to be discussed.
<urn:uuid:560295bb-d300-4306-93ae-705d5fa4f475>
CC-MAIN-2021-43
https://www.bosch.com/stories/future-of-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2021-43/segments/1634323585209.43/warc/CC-MAIN-20211018190451-20211018220451-00077.warc.gz
en
0.952989
1,249
3.953125
4
Artificial general intelligence (AGI) also known as strong AI is the mimic of generalized human cognitive abilities. AGI has the thinking and acting capabilities of human beings. So it can think, perform as human beings. It is the application of emergent behavior that ensures reinforced learning. Strong Artificial intelligence describes a mindset of AI development within a game environment. Moreover, it is indistinguishable from the human brain. Here we will discuss details of AGL and the best possible artificial general intelligence Examples. AGI Meaning: What is the best definition of strong AI? Deep AI or Artificial general intelligence (AGI) refers to the hypothetical ability of an intelligent agent that can perform any intellectual task like humans. It is the representation of human cognitive performance through machines. It consists of comprehensive knowledge and cognitive computing capabilities of the human brain. Sometimes, this type of artificial intelligence is beyond human capacities to process vast amounts of data. Artificial General Intelligence (AGI) also refers to general intelligent action, full AI, and deep AI. AGI is a continuous process that discusses science fiction and future studies. We can compare it with a child because it focuses on learning through experience. It constructs mental abilities through different functions and processes impersonated from the human brain. It needs continuous processing, developing, and upgrading to reach the optimum perfection level. We can see science fiction movies regarding the application of artificial general intelligence. Sometimes we found it in games. AGI is independent and can adapt to new situations. It is a crucial part of the AI revolution. The theory of Artificial General Intelligence (AGI ) rests upon complex machine systems that study neural networks. It is the system’s actual ability capable of solving complex situations by trial and error systems. Many AI experts think that AGI does not exist, but some believe in AGI. Why Do We Call Strong AI as General AI? Strong general intelligence is capable of doing many complex tasks. As a result, maximum intelligence functions can be solved by this AI. So we can generalize all the requirements if we can implement AGI. As a result, we can consider it as general AI. Tests of AGI So far, we have found two tests of deep AI. The first one is the Turing Test, developed by Alan Turing in 1950. He discussed it in the paper “Computing Machinery and Intelligence”. The second one is the Chinese Room Argument (CRA). John Searle discussed it in 1980. Strong AI vs. weak AI Strong AI is the intimation of human intelligence. We can compare it to a childhood brain. A brain develops from childhood to adulthood. Similarly, this AI develops in the process of the learning experience. Weak AI or narrow AI is the opposite of it. Narrow AI has limited memory and does not have any experience. AGI is dedicated to all sorts of tasks. On the other hand, weak AI is capable of solving particular problems. Artificial general intelligence performs a variety of functions and solves problems from the scenario. It does not rely on human interference. In the country, weak AI depends on human interference. Fuzzy logic is the example of AGI, and Self-driving cars and virtual assistants are the examples of weak AI. Why is Deep AI so powerful? AI bridges the gap between data science and its execution. This emerging technology has become an essential part of our daily life. 
Strong AI vs. weak AI

Strong AI is an imitation of human intelligence. We can compare it to a child's brain: a brain develops from childhood to adulthood, and this kind of AI likewise develops through accumulated learning experience. Weak AI, or narrow AI, is its opposite: it has limited memory and no such accumulated experience. AGI would be suited to all sorts of tasks, while weak AI solves particular, predefined problems. Artificial general intelligence would perform a variety of functions, work out solutions from the scenario at hand, and operate without human intervention; weak AI, in contrast, depends on it. Fuzzy logic is sometimes cited as a step toward AGI, while self-driving cars and virtual assistants are examples of weak AI.

Why is Deep AI so powerful?

AI bridges the gap between data science and its execution, and this emerging technology has become an essential part of our daily lives. Deep AI combines big data, machine learning, neural networks, and deep learning. It gathers experience from previous learning at every moment and can act according to the situation. Together, these elements make it powerful.

What Are Some Artificial General Intelligence Examples?

Applications of AGI are still at the development stage. The aim is to simplify tasks and return results faster and more accurately, supporting precise predictions, decision-making, and analysis. Here are some possible examples of strong artificial intelligence.

1. Manufacturing Robots
AI-based robot control solutions can automate manual workstations. Manufacturing robots can plug in cables, assemble products, pick parts, and track contours, often assisted by a single camera.

2. Self-driving Cars
Self-driving cars can reach their destinations without a human driver, using sensors, cameras, radar, and artificial intelligence to run smoothly. nuTonomy, AutoX, Drive.ai, Optimus Ride, Waymo, Zoox, and Tesla are examples of companies building AI-enabled self-driving cars.

3. Smart Assistants
AI assistants combine microphones, computer chips, and AI software. They take spoken input and process it intelligently. Alexa, Siri, Nina, Viv, and Google Assistant are examples of smart assistants.

4. Proactive Healthcare Management
Healthcare offers countless applications. AI can contribute to EKGs, genomics, radiology images, blood tests, and the management of patient medical histories, reducing the chance of human error.

5. Disease Mapping
AI is well suited to mapping infectious diseases, and the COVID-19 pandemic showed how heavily such tools can be used for tracking outbreaks.

6. Automated Financial Investing
A digital platform encodes the investment policy, customer information, and a trading algorithm, and underwriters take decisions from the credit system. More general AI could make these decisions with greater confidence.

7. Virtual Travel Booking Agent
The travel industry has boomed with the adoption of AI, which is highly suitable for flight and accommodation booking. Intelligent chatbots and AI assistants are key factors in drawing tourists to the industry.

8. Social Media Monitoring
AI matters for social media monitoring too. It can deliver insight into social media profiles and brand perception and respond to customers based on their preferences. The Facebook chatbot is one example of AI-driven social media monitoring.

9. Conversational Marketing Bot
A set of AI technologies handles automated speech recognition and human-like interaction. Such bots serve customers 24/7 and reduce staffing costs, and almost all large business organizations now use them.

10. Natural Language Processing (NLP) Tools
Natural Language Processing (NLP) is a branch of artificial intelligence that understands natural language and acts on it. Translation, spell checking, and topic classification are typical NLP tools.

11. Fraud News Detection
Deep AI can understand the intent behind a message, so it can detect fraudulent news quickly and help protect people from being misled. (A minimal sketch of this kind of text classifier follows this list.)
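Items 10 and 11 can be prototyped today with narrow-AI tools. Below is a minimal sketch, assuming scikit-learn is installed; the headlines and labels are invented for illustration, and a real fraud-news detector would need a large labeled corpus and far more careful evaluation.

```python
# Minimal text-classification sketch (narrow AI, not AGI): TF-IDF features plus
# a naive Bayes classifier, a classic baseline for topic or misleading-news
# detection. All training data below is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

headlines = [
    "Miracle cure doctors don't want you to know about",
    "You won't believe what this celebrity did next",
    "Central bank raises interest rates by 25 basis points",
    "City council approves new public transit budget",
]
labels = [1, 1, 0, 0]  # 1 = misleading, 0 = ordinary

# Pipeline: turn raw text into TF-IDF vectors, then fit a multinomial
# naive Bayes model on those vectors.
model = make_pipeline(TfidfVectorizer(), MultinomialNB())
model.fit(headlines, labels)

print(model.predict(["Shocking secret the government is hiding from you"]))
```

Note that everything here is narrow AI: the model learns exactly one task from labeled examples, which is precisely the gap AGI is supposed to close.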
What Can Artificial General Intelligence Do?

Artificial general intelligence is also known as deep AI. The concept mimics the human brain: understanding, thinking, and applying solutions in real life. Its theory-of-mind component models the needs, beliefs, emotions, and thoughts of human psychology, so it could perceive its environment and solve any real-life problem the way humans do.

Requirements of Artificial General Intelligence

We have already looked at narrow, or weak, AI. AGI is different from narrow AI, and so are its requirements. Deep AI calls for:
- Application of common sense
- Background knowledge
- Capacity to learn via machine learning and deep learning algorithms
- Good knowledge of statistics and modeling
- Transfer learning and abstraction (a short sketch follows at the end of this article)
- Knowledge of different programming languages
- The ability to write algorithms that find patterns and learn from them

How far are we from artificial general intelligence?

Everything is possible in theory, but real life is not so easy. We are still in the age of narrow AI; scientists keep trying, but AGI remains far away. Quantum computing may prove a gateway to artificial general intelligence. With the advancement of technology, we can hope for deep AI within a few decades: some experts expect a beta version of AGI around 2030, while ongoing research suggests artificial general intelligence is more realistic by around 2060.

Will Deep AI Happen? Is AGI Possible?

We can assume AGI is possible because the brain's cognition can, in principle, be simulated. But whether human intellectual abilities can truly be duplicated remains controversial, and doubt is fed by the lack of substantial progress toward deep AI and by the absence of any agreed definition of AGI in terms of the human brain. So the question stands: is artificial general intelligence possible or not?

Artificial general intelligence, or deep AI, is meant to mimic the human brain; we can compare it to the brain of a child, learning at every moment. We see it in science fiction films, but in the real world it may never be achievable. Scientists are nonetheless trying to build strong AI into different solutions, and we hope to see far more capable artificial intelligence within the next decade.
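The requirements list above mentions transfer learning. As a minimal sketch of what that means in practice (assuming Python with recent torch and torchvision installed; the two-class task and the fake batch are invented for illustration):

```python
# Transfer-learning sketch: reuse a network pretrained on ImageNet and retrain
# only its final layer for a new 2-class task.
import torch
import torch.nn as nn
from torchvision import models

# Load a ResNet-18 with pretrained ImageNet weights (downloads on first use).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained feature extractor...
for param in model.parameters():
    param.requires_grad = False

# ...and replace the classification head with a fresh 2-class layer.
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a random fake batch; real data is
# needed in practice.
images = torch.randn(4, 3, 224, 224)
targets = torch.tensor([0, 1, 0, 1])
loss = criterion(model(images), targets)
loss.backward()
optimizer.step()
print(float(loss))
```

The pretrained features transfer to the new task, so only the small final layer needs training. That kind of knowledge reuse, generalized far beyond image labels, is one step toward what AGI would require.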
Complex 3-D nanoscale architectures based on DNA self-assembly can conduct electricity without resistance and may provide a platform for fabricating quantum computing and sensing devices.

Three-dimensional (3-D) nanostructured materials — those with complex shapes at a size scale of billionths of a meter — that can conduct electricity without resistance could be used in a range of quantum devices. For example, such 3-D superconducting nanostructures could find application in signal amplifiers to enhance the speed and accuracy of quantum computers, and in ultrasensitive magnetic field sensors for medical imaging and subsurface geology mapping. However, traditional fabrication tools such as lithography have been limited to 1-D and 2-D nanostructures like superconducting wires and thin films.

Now, scientists from the U.S. Department of Energy's (DOE) Brookhaven National Laboratory, Columbia University, and Bar-Ilan University in Israel have developed a platform for making 3-D superconducting nano-architectures with a prescribed organization. As reported in the November 10, 2020, issue of Nature Communications, this platform is based on the self-assembly of DNA into desired 3-D shapes at the nanoscale. In DNA self-assembly, a single long strand of DNA is folded by shorter complementary "staple" strands at specific locations — similar to origami, the Japanese art of paper folding.

"Because of its structural programmability, DNA can provide an assembly platform for building designed nanostructures," said co-corresponding author Oleg Gang, leader of the Soft and Bio Nanomaterials Group at Brookhaven Lab's Center for Functional Nanomaterials (CFN) and a professor of chemical engineering and of applied physics and materials science at Columbia Engineering. "However, the fragility of DNA makes it seem unsuitable for functional device fabrication and nanomanufacturing that requires inorganic materials. In this study, we showed how DNA can serve as a scaffold for building 3-D nanoscale architectures that can be fully 'converted' into inorganic materials like superconductors."

To make the scaffold, the Brookhaven and Columbia Engineering scientists first designed octahedral-shaped DNA origami "frames." Aaron Michelson, Gang's graduate student, applied a DNA-programmable strategy so that these frames would assemble into desired lattices. He then used a chemistry technique to coat the DNA lattices with silicon dioxide (silica), solidifying the originally soft constructions, which had required a liquid environment to preserve their structure. The team tailored the fabrication process so the structures were true to their design, as confirmed by imaging at the CFN Electron Microscopy Facility and by small-angle x-ray scattering at the Complex Materials Scattering beamline of Brookhaven's National Synchrotron Light Source II (NSLS-II). These experiments demonstrated that the structural integrity was preserved after the DNA lattices were coated.

"In its original form, DNA is completely unusable for processing with conventional nanotechnology methods," said Gang. "But once we coat the DNA with silica, we have a mechanically robust 3-D architecture that we can deposit inorganic materials on using these methods. This is analogous to traditional nanomanufacturing, in which valuable materials are deposited onto flat substrates, typically silicon, to add functionality."

The team shipped the silica-coated DNA lattices from the CFN to Bar-Ilan's Institute of Superconductivity, which is headed by Yosi Yeshurun. Gang and Yeshurun became acquainted a couple of years ago, when Gang delivered a seminar on his DNA assembly research. Yeshurun — who over the past decade has been studying the properties of superconductivity at the nanoscale — thought that Gang's DNA-based approach could provide a solution to a problem he had been trying to solve: how to fabricate superconducting nanoscale structures in three dimensions.

"Previously, making 3-D nanosuperconductors involved a very elaborate and difficult process using conventional fabrication techniques," said Yeshurun, co-corresponding author. "Here, we found a relatively simple way using Oleg's DNA structures."

At the Institute of Superconductivity, Yeshurun's graduate student Lior Shani evaporated a low-temperature superconductor (niobium) onto a silicon chip containing a small sample of the lattices. The evaporation rate and the silicon substrate temperature had to be carefully controlled so that the niobium coated the sample without penetrating all the way through; if that happened, a short could occur between the electrodes used for the electronic transport measurements. "We cut a special channel in the substrate to ensure that the current would only go through the sample itself," explained Yeshurun.

The measurements revealed a 3-D array of Josephson junctions, or thin nonsuperconducting barriers through which superconducting current tunnels. Arrays of Josephson junctions are key to leveraging quantum phenomena in practical technologies, such as superconducting quantum interference devices (SQUIDs) for magnetic field sensing. In 3-D, more junctions can be packed into a small volume, increasing device power.
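For readers unfamiliar with Josephson junctions, the two standard textbook relations below summarize how they work; they are general background, not results from this paper. The supercurrent depends on the phase difference between the superconductors on either side of the barrier, and that phase evolves in time when a voltage is applied:

```latex
% The two basic Josephson relations (textbook background, not from the paper).
% I_c is the junction's critical current, V the voltage across the junction,
% e the electron charge, and \hbar the reduced Planck constant.
\begin{align}
  I &= I_c \sin\varphi,
    && \text{(DC Josephson effect: zero-voltage supercurrent)} \\
  \frac{d\varphi}{dt} &= \frac{2e}{\hbar}\, V.
    && \text{(AC Josephson effect)}
\end{align}
```

A SQUID exploits the first relation: the critical current of a superconducting loop containing junctions oscillates with the magnetic flux through the loop, which is what makes junction arrays attractive for magnetometry.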
"DNA origami has been producing beautiful and ornate 3-D nanoscale structures for almost 15 years, but DNA itself is not necessarily a useful functional material," said Evan Runnerstrom, program manager for materials design at the U.S. Army Combat Capabilities Development Command Army Research Laboratory of the U.S. Army Research Office, which funded the work in part. "What Prof. Gang has shown here is that you can leverage DNA origami as a template to create useful 3-D nanostructures of functional materials, like superconducting niobium. This ability to arbitrarily design and fabricate complex 3-D-structured functional materials from the bottom up will accelerate the Army's modernization efforts in areas like sensing, optics, and quantum computing."

"We demonstrated a pathway for how complex DNA organizations can be used to create highly nanostructured 3-D superconducting materials," said Gang. "This material conversion pathway gives us an ability to make a variety of systems with interesting properties — not only superconductivity but also other electronic, mechanical, optical, and catalytic properties. We can envision it as a 'molecular lithography,' where the power of DNA programmability is transferred to 3-D inorganic nanofabrication."

Reference: "DNA-assembled superconducting 3D nanoscale architectures" by Lior Shani, Aaron N. Michelson, Brian Minevich, Yafit Fleger, Michael Stern, Avner Shaulov, Yosef Yeshurun and Oleg Gang, 10 November 2020, Nature Communications.

This research was supported by the U.S. Department of Defense, Army Research Office; the DOE Office of Science; the Israeli Ministry of Science and Technology; and the Israel Science Foundation. Both CFN and NSLS-II are DOE Office of Science User Facilities. Some imaging studies were carried out at the Imaging Facility of the City University of New York Advanced Science Research Center.