Focus: Electron Spin Influences Nanotube Motion

The spin of an electron often occupies a reality all its own, with little bearing on the electron's overall motion or the motion of nearby atoms. But theoretical work reported in Physical Review Letters demonstrates that the spin of a single electron trapped on a carbon nanotube may influence—and be influenced by—the vibrations of the nanotube. The researchers behind the work foresee this spin-mechanical combination having a role in nanoscale mass sensors or in information processing elements of a quantum computer.

Electron spin sometimes makes its presence known in subtle ways. A single energy level in an atom can become two closely spaced levels because of so-called spin-orbit coupling. In this effect, the motion of the electron around the nucleus creates an effective magnetic field that causes spin-up electrons to have a slightly different energy from spin-down electrons. A similar sort of spin-orbit coupling was recently discovered in carbon nanotubes. The delocalized electrons, those not associated with specific atoms, follow circular orbits around the tube circumference. As in an atom, this motion causes one orientation of the electron spin to have lower energy than the opposite orientation.

This relationship between the electron spin and the cylindrical geometry of a nanotube means that motion of the tube could alter the spin, and vice versa. András Pályi of Eötvös University in Budapest and his colleagues have now proposed an experiment to demonstrate this connection. The proposal is based on recent work examining the relationship between nanotube motion and electric current. In their model, the team imagines a carbon nanotube suspended between two leads about a half micron apart, with a single electron trapped on this "tightrope." An externally applied magnetic field pointing along the nanotube axis splits the ground state of the electron into two energy levels, corresponding to the electron spin being parallel and antiparallel to the magnetic field.

As in earlier work, vibrations in the nanotube can be excited by radio waves tuned to one of the resonant frequencies. The changes in the shape of the nanotube alter the orbital path of the trapped electron, and because of the strong spin-orbit coupling, the electron's spin can switch direction. To maximize the effect on the spin, the theorists found that the magnetic field strength must be set so that the energy difference between the two spin states matches the energy of the nanotube vibration.

Pályi and his colleagues found that the system mimics the well-studied case of an atom in an optical cavity, where the atom can only emit or absorb light for which an integer number of half-wavelengths matches the cavity's length. Similarly, in the sound-wave (or phonon) cavity of the stretched nanotube, transitions between the two spin states are driven by nanotube vibrations. And the influence goes both ways: numerical calculations showed that the coupling to a single electron spin shifts the frequency at which the nanotube vibrates.

The team says that adding such a spin dependence could improve the sensitivity of nanotube-based sensors that can already measure the mass of a handful of atoms. In quantum computing, the oscillations of the nanotube could be used to flip the value of a spin qubit or process it in some way.
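The resonance condition described above can be made concrete with a rough, back-of-the-envelope calculation. The sketch below (Python) assumes a free-electron-like g-factor of about 2 and a 500 MHz mechanical mode; both numbers are illustrative guesses rather than values taken from the paper.

```python
# Resonance condition: Zeeman splitting g * mu_B * B = phonon energy h * f.
H_PLANCK = 6.626e-34    # Planck constant, J*s
MU_BOHR = 9.274e-24     # Bohr magneton, J/T

def resonant_field(mech_freq_hz, g_factor=2.0):
    """Magnetic field at which the spin splitting matches the phonon energy."""
    return H_PLANCK * mech_freq_hz / (g_factor * MU_BOHR)

if __name__ == "__main__":
    f = 500e6  # assumed flexural-mode frequency for a ~0.5 micron suspended tube
    print(f"Resonant field for a {f/1e6:.0f} MHz mode: {resonant_field(f)*1e3:.1f} mT")
```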
Since the nanotube motion can be driven by simple radio waves from a small antenna, this qubit control might be less challenging than other techniques that rely on rapidly varying magnetic fields, Pályi says.

"This is a promising hybrid system, and experiments are making rapid progress, placing novel proposals like this in high demand," says Steven Bennett from Harvard University. One challenge is measuring the vibrations in the nanotube, says Gary Steele from the Technical University of Delft in the Netherlands. Typically this tiny motion has been observed using currents through the nanotube, but the goal here is to keep electrons in place on the nanotube. Developing external detectors of nanotube motion is "a very challenging task," Steele says, but one that he and others are working on right now.

Michael Schirber is a freelance science writer in Lyon, France.

- F. Kuemmeth, S. Ilani, D. C. Ralph, and P. L. McEuen, "Coupling of Spin and Orbital Motion of Electrons in Carbon Nanotubes," Nature 452, 448 (2008)
- G. A. Steele, A. K. Huttel, B. Witkamp, M. Poot, H. B. Meerwaldt, L. P. Kouwenhoven, and H. S. J. van der Zant, "Strong Coupling Between Single-Electron Tunneling and Nanomechanical Motion," Science 325, 1103 (2009)
- D. V. Bulaev, B. Trauzettel, and D. Loss, "Spin-Orbit Interaction and Anomalous Spin Relaxation in Carbon Nanotube Quantum Dots," Phys. Rev. B 77, 235301 (2008)
Jonathan Santiago, of MIT, describes his experience working with GreenFab and teaching advanced technical concepts to students in the Bronx. Here, a student experiments with using an Arduino microcontroller to power LEDs. Credit: Jonathan Santiago, STEM2GETHER

This Behind the Scenes article was provided to LiveScience in partnership with the National Science Foundation.

Growing up I was always very good at using computers, but I never really understood how they worked. Until I began my undergraduate studies at MIT, I never made the connection between what happens inside a desktop computer and what happens inside other everyday electronic devices. I didn't know what microcontrollers were, or how they've been used in TV remotes, MP3 players, cell phones, space shuttles, medical devices, and of course, personal computers.

Now I help teach embedded electronics, also known as physical computing, and digital fabrication to high school students in the South Bronx section of New York City. Going to school in the country's poorest congressional district, the students who participate in the NSF-funded GreenFab program have had no shortage of obstacles to their academic success. Despite these challenges, we don't doubt for a moment that we can teach our students advanced technical concepts, and inspire many of them to pursue careers in green technology and engineering.

Although I now have a better understanding of embedded processing and enjoy teaching physical computing and digital fabrication to our students, I still have vivid memories of what it felt like to be completely in the dark. The late science fiction author Arthur C. Clarke once said that, "Any sufficiently advanced technology is indistinguishable from magic." If that's the case, then I felt during my first year at MIT that the engineers and scientists I met were indistinguishable from sorcerers. Some of the projects that were underway in Neil Gershenfeld's Center for Bits and Atoms research group (also NSF funded) included NMR quantum computing, inertial measurement devices, liquid computers, new internet protocols for household objects, and the creation of low-cost digital fabrication laboratories around the world.

Involvement with the latter project, called FabLabs, is what led me to work for Sustainable South Bronx and Vision Education & Media to teach kids. FabLabs began as an outreach project from the Center for Bits and Atoms (CBA) group. They have a mission to provide widespread access to Computer Numerically Controlled (CNC) fabrication equipment and other modern tools for invention. An international network of FabLabs is currently evolving, with activities ranging from youth technology enrichment programs to the incubation of small-scale high-tech businesses.

Working with FabLab tools during my undergraduate research at MIT was actually more helpful in demystifying how things are made and how things work than the introductory electrical engineering and computer science classes I took before I switched my major to mathematics. Those introductory classes focused heavily on first principles and abstract concepts, postponing hands-on work until a theoretical framework was learned. FabLabs take the opposite approach: you learn concepts as they become necessary. FabLabs provide a great opportunity to make engineering and science hands-on for kids, rather than remote and abstract. Since the program began this past February, there is one student who stands out as an example.
At first, Jose was somewhat of a challenging student to work with. Obviously a bright kid without a lot of energy, he wasn't always able to maintain focus and attention on any particular task. He saw everything as complicated and difficult, often giving up very early. Jose made a noticeable transformation during this past summer session, after building a DIY (do-it-yourself) robot project from scratch.

I told Jose and a few other students to look up the Arduino SERB robot, which could be made from scratch using equipment and parts that we either had in the lab or could be easily obtained. Jose found a great tutorial on Instructables.com on how to put the robot together. With a minimal amount of supervision, Jose was able to finish the project and even add his own variation. He wanted the robot to be controlled by a Nintendo Wii 'Nunchuck' controller, which uses an accelerometer to control a gaming interface. Taking his own initiative, he researched how to 'hack' the Wii controller to have it interface with the SERB robot. Jose is currently a senior in high school and has expressed a strong desire to pursue an engineering degree at the State University of New York at Buffalo, citing GreenFab as a motivating factor in his career ambitions.

At the very least, we hope that when the students complete our program they will have acquired independent learning skills and a penchant for questioning how things work. This might lead to questioning how other complicated systems work, such as urban politics, development, and infrastructure. GreenFab could lead them to ask questions like, "Why are there fewer green spaces in the South Bronx than the West Village?," "Why does the city want to build more prisons and waste handling facilities in my neighborhood?," or "Saving polar bears is cool and everything, but can the 'green' movement actually help me earn a decent living?"

The GreenFab Winter Project Exhibition will be December 21, 4:00 – 6:00 p.m. at 841 Barretto Street in the Bronx.

Editor's Note: This research was supported by the National Science Foundation (NSF), the federal agency charged with funding basic research and education across all fields of science and engineering. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author and do not necessarily reflect the views of the National Science Foundation.
How do you predict how a given quantum state (which always corresponds to a single point anywhere on or in the sphere) will react if subjected to a given quantum measurement (which always corresponds to a single axis)? To see how, draw a new line perpendicular to the measurement axis and passing through the point corresponding to the quantum state. The new line will divide the measurement axis into two segments whose lengths correspond to the probabilities of the two possible measurement results. An example: this figure shows a measurement that corresponds to tilting your polarizer (for example, sunglasses) by 55 degrees, and gives a 20% chance of measuring state (2) and an 80% chance of measuring state (1).

This is now the third or fourth time we've encountered some type of measurement which gives random results. This is the first classical assumption that we have to let go:

Classical Theory: God does not play dice with the universe.
Quantum Theory: Quantum measurement can give random results.

Are these results truly random, or did we just not know the answer before performing the measurement? This is very similar to asking the question, "If you measure the same qubit many times, do you get the same answer?" Answer: If you perform the same measurement, you always get the same result. Only the first measurement result is (potentially) random.

Wait a minute. That doesn't seem right. Let's say I measure a photon several times in a row using the red, green, and blue axes (horizontal, diagonal, and right-circular polarizers) and every time the photon is transmitted. 100 percent horizontal, 100 percent diagonal, 100 percent right-circular? Remember that we are plotting each point using the answers to those three questions. Plotting 100 percent/100 percent/100 percent will give a point far outside the sphere, a state forbidden by quantum mechanics. Can we really never obtain these results, or is quantum mechanics wrong about the sphere? Neither. This is a false choice, because it's based on our second false assumption about classical information—an assumption that doesn't apply to qubits:

Classical Theory: Reading the value of a bit doesn't change the bit's value.
Quantum Theory: Measuring a qubit changes its value to match the result of the measurement.

Using this newfound principle of quantum information, let's walk through an example. Let's start with a horizontally polarized photon, H. If we measure this photon using the H/V axis, it will stay horizontally polarized. If we then perform a measurement on the green axis, it is randomly transformed into either D or A (we'll choose D for this example). Finally, we repeat our first measurement using the red axis. Instead of giving the same result as our first measurement, however, there's now a 50 percent chance that our horizontal photon will be measured as vertical! The intervening measurement has changed the state of the qubit. (It's worth noting here that measurement is a real process, the same process that polarizing sunglasses and 3D lenses perform. You can test all of these examples with just a few pairs of eyewear.)

Measurement appears, strangely, to be one way we can change the state of a qubit. For a quantum programmer wanting to adjust the qubits in a quantum computer, however, this may not be a good choice. After all, the results are random!
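The projection rule and the "measurement changes the state" rule described above can be captured in a few lines of code. The sketch below is a minimal Bloch-vector toy model written for this article's picture, not the authors' own demonstration; the axis assignments and the seeded random generator are choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

RED = np.array([0.0, 0.0, 1.0])    # H/V axis
GREEN = np.array([1.0, 0.0, 0.0])  # D/A axis

def measure(state, axis):
    """Measure a Bloch vector along an axis: return (+1/-1 outcome, collapsed state)."""
    p_plus = 0.5 * (1.0 + state @ axis)   # projection onto the axis gives the probability
    outcome = 1 if rng.random() < p_plus else -1
    return outcome, outcome * axis        # the state jumps onto the measurement axis

state = RED.copy()                        # a horizontally polarized photon, H
print(measure(state, RED))                # repeating the same measurement: always H
outcome, state = measure(state, GREEN)    # green-axis measurement: 50/50 D or A
print(outcome, measure(state, RED))       # now H vs V is 50/50, as in the walkthrough above
```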
Although some exotic quantum algorithms use measurement in the middle of a computation, most of the time measurement is reserved for the end, when the programmer learns the result of the computation. How, then, do we change qubits without introducing randomness? Physically, every species of qubit (photon, electron, ion, etc.) is changed differently. Photon polarization, for example, can be changed by directing a photon through quartz crystals or Scotch tape. All of these processes, regardless of the species of qubit or the type of change, have a simple interpretation on the single-qubit sphere: they act as rotations.

These rotations are defined by a single axis, just like measurements. But instead of projecting all possible states onto two possible outcomes, they rotate all states about that axis. Only the points on the axis of rotation are unaffected. As an example, think of rotating the sphere by 90 degrees about the red axis. This kind of quantum operation leaves H and V unchanged, but transforms R → D, D → L, L → A, and A → R.

We can now summarize the important characteristics of single qubits:
- All single-qubit states correspond to a point on or inside a sphere.
- Every axis corresponds to a single quantum measurement, and every measurement changes the state of the qubit to match the result of the measurement.
- Qubits can be changed by rotating them around an axis.

Although this succinctly describes the way in which a one-qubit quantum computer is supposed to work, what happens when things go wrong? For a classical bit, the only thing that can go wrong is for a bit to unexpectedly flip from zero to one or one to zero. The same type of thing could happen to qubits, in the form of unexpected or unwanted rotations. But there's another type of process, one that researchers in quantum computing are constantly fighting to eliminate: decoherence.

Decoherence happens when something outside of the quantum computer performs a measurement on a qubit, the result of which we never learn. Let's say we measure the state H in the D/A (green) axis. There's a 50% chance of measuring H in the state D and a 50% chance of measuring it in the state A. If we never learn which state the measurement resulted in, we'll have no idea how to predict the result of another measurement. This process is called decoherence, and, in fact, it's how states inside the sphere are created. By measuring along an axis but never learning the result, all points on the sphere collapse to the measurement axis. By partially measuring something (with, say, really thin polarized sunglasses), we can collapse only part of the way.

This sort of unwanted intrusion introduces randomness into a quantum computer. Because quantum bits can be single electrons, single ions, or single photons, all of which can be accidentally measured by a single stray atom, it can be exquisitely difficult to avoid decoherence. That's the primary reason that a 100-qubit quantum computer has not yet been built.
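Rotations and decoherence fit the same toy picture. The following sketch is again an illustrative Bloch-vector model with the same assumed axis labels as before: it rotates states about an axis, and it shrinks states toward an axis when a measurement result is thrown away.

```python
import numpy as np

RED = np.array([0.0, 0.0, 1.0])    # H/V axis
GREEN = np.array([1.0, 0.0, 0.0])  # D/A axis

def rotate(state, axis, angle_deg):
    """Rodrigues' rotation of a Bloch vector about a unit axis."""
    k = axis / np.linalg.norm(axis)
    t = np.radians(angle_deg)
    return (state * np.cos(t) + np.cross(k, state) * np.sin(t)
            + k * (k @ state) * (1 - np.cos(t)))

def decohere(state, axis, strength=1.0):
    """Unread measurement along `axis`: keep the on-axis component,
    shrink the rest by `strength` (1.0 = full collapse onto the axis)."""
    along = (state @ axis) * axis
    return along + (1.0 - strength) * (state - along)

R_STATE = np.array([0.0, 1.0, 0.0])                     # right-circular polarization
print(rotate(R_STATE, RED, -90))                        # R -> D (the rotation sense is a labeling choice)
print(decohere(np.array([0.0, 0.0, 1.0]), GREEN))       # H decoheres to the center of the sphere
print(decohere(np.array([0.0, 0.0, 1.0]), GREEN, 0.5))  # a partial ("thin sunglasses") measurement
```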
Speed of light

The speed of light in vacuum is held to be constant at 299,792,458 m/s (186,282.397 miles per second). Designated by the symbol "c" (for "constant"), it is a fundamental quantity of the universe. According to special relativity it is the universe's speed limit, and it is part of the relation between mass and energy, E = mc². Some have proposed that the speed of light has decayed since the Creation. While this theory opened the door to scientific solutions to the distant starlight problem, it is not generally accepted by creation scientists.

One-Way Speed of Light

Sagnac proved that light travels at different speeds depending on its direction and its proximity to the center of Earth's gravity, lending weight to the Anisotropic convention. The one-way speed of light has never been measured. Every known measurement of the speed of light includes reflecting it from another surface. This necessarily changes the nature of light, as it can only be the average of the outbound and inbound leg. Additionally, all electronic means to measure the speed of light cannot themselves operate at the speed of light. This introduces error and constraint into the measurement. If we attempt to embed a signal into a light beam to synchronize two clocks at a distance, the time it takes both to create and to interpret the signal introduces another constraint. In fact, any introduction of a measurement mechanism necessarily constrains the measurement, because no measurement mechanism can operate at the speed of light.

Einstein understood the primary paradox of the speed of light, as evidenced by the theory of black holes. A black hole's gravity is so strong that light cannot reach escape velocity. However, gravity can only act in this manner between bodies with mass, which necessarily means that photons have mass. Physicists generally do not accept the notion that photons have mass. If they do not, they would be able to escape a black hole, and it would not be black after all. However, if the photon has mass, then it is a particle with mass traveling at the speed of light. For such particles, time stands still. There is no duration between their departure (from an emitting source) and their destination. Essentially departure and arrival are instantaneous. If this is the case with a photon, then there is no such thing as a light-year in space, and the age of the Cosmos cannot be determined using light as a basis.

Moreover, the speed of light is a function of distance and duration: speed = distance/time. However, Einstein asserted that time is relative. If this is true, then the speed of light is also relative and cannot be constant. To resolve this paradox, Einstein side-stepped it by stipulating that the speed of light is constant without ever proving it: "That light requires the same time to traverse the path A → M as for the path B → M is in reality neither a supposition nor a hypothesis about the physical nature of light, but a stipulation which I can make of my own freewill in order to arrive at a definition of simultaneity" (Einstein 1961, p. 23) [emphasis is in the original].

Whenever scientists encounter particle behaviors that defy the speed of light, such as the propensity of particles to instantly share behaviors even across vast distances (e.g. Quantum Entanglement), they still hold to the notion that the speed of light is constant, eliciting the strangest explanations, including the idea that all particles in the universe are connected to all other particles through wormholes.
Such oddball theories are the simplest evidence that the "constant" speed of light has been accepted as a reality rather than a stipulation for mathematical purposes.

Albert A. Michelson is credited with developing the method for the definitive measurement of the speed of light. In 1902 he published his classic paper on the speed of light, and in 1907 he was awarded the Nobel Prize in Physics for this work. Michelson also proposed the standardization of the international unit of length, the meter, using specified wavelengths of light rather than an artifact. For decades the scientific community used Michelson's standardization method, but finally decided to define the SI unit of length according to the speed of light. Today one meter is defined as exactly 1/299,792,458 of the distance that a beam of light travels in one second.

Many scientists in the past have speculated about possible changes in the values of one or more physical constants and their implications. These speculations were not always greeted with enthusiasm from the scientific community because the implications of any variation in any constant are enormous: it would introduce changes at astronomical levels in the very fiber of the Universe. Yet the idea never totally died out and was never totally suppressed. Glenn Morton was one of the first persons to put forth a concrete and testable model. He started not from changing fundamental constants, but from another angle. Soon Barry Setterfield came forward with his proposal of variation in the velocity of light. His initial proposal went through several revisions and modifications, and creationist publications quoted him widely. Some secular publications also used the information, but the general response was to resist his proposals. Johnson C. Philip from India put forth the same idea in a broader way in 1982 and did some work with the Physics department of Jiwaji University in India. However, he had to abandon the work in 1984 due to the resistance of some non-creationist professors. The proposal remains promising, and much work can be done. The resistance remains, especially from non-creationists. However, the topic might find a revival, now that the secular community has started to consider the idea of changing fundamental constants.

The speed of light has been used to calculate the distance of supernova 1987A from earth with great accuracy, based on observing the time taken for its light to illuminate the Large Magellanic Cloud. It is the standard method for calculating the distance to nearby galaxies. The part of the SN1987A ring perpendicular to the explosion center (as seen from us) was observed to light up about 8 months after the explosion. The light that took a detour via the ring to us was always a ring radius behind the direct light, regardless of the speed of light that prevailed during the trip. The ring radius could thus be calculated as these 8 months times the speed of light as applied to the year 1987, when the measurement was made. It is therefore not possible from this observation to deduce whether light had a different speed before 1987.

The notion of c-decay is currently out of favor even among creationists. Two models for the creation of the universe, i.e. white hole cosmology and cosmological relativity, both assume a constant value of c. The Anisotropic Synchrony Convention holds for a variable value for c, and likewise provides for c to be relative to the speed of the emitting object.
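The ring-timing argument above amounts to a one-line calculation. The sketch below (Python) uses the approximately 8-month lag quoted in the text; the exact lag, and therefore the resulting radius, are only rough figures.

```python
C = 299_792_458.0            # m/s, the defined value quoted above
YEAR_S = 365.25 * 24 * 3600  # seconds per year

lag_s = (8 / 12) * YEAR_S    # ~8 months between the direct light and the ring lighting up
radius_m = C * lag_s
print(f"ring radius ~ {radius_m:.2e} m ~ {radius_m / (C * YEAR_S):.2f} light-years")
```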
Anisotropism is the actual de facto convention for Scripture, as God describes things from a human's-eye point of view. Even Christ said he would use earthly things to describe heavenly things. The human point of view is integrated into the Anisotropic convention, providing for the instantaneous arrival of distant starlight as well as explaining local measurement in terms of time dilation.

- Biography of Albert A. Michelson from the Nobel Committee
- An Alternate View of SN1987A by Selva Harris
- Speed of light may have changed recently by Eugenie Samuel Reich, NewScientist.com
First Electronic Quantum Processor Created

A team led by Yale University researchers has created the first rudimentary solid-state quantum processor, taking another step toward the ultimate dream of building a quantum computer.

The two-qubit processor is the first solid-state quantum processor that resembles a conventional computer chip and is able to run simple algorithms. (Credit: Blake Johnson/Yale University)

They also used the two-qubit superconducting chip to successfully run elementary algorithms, such as a simple search, demonstrating quantum information processing with a solid-state device for the first time. Their findings appeared in Nature's advance online publication June 28.

"Our processor can perform only a few very simple quantum tasks, which have been demonstrated before with single nuclei, atoms and photons," said Robert Schoelkopf, the William A. Norton Professor of Applied Physics & Physics at Yale. "But this is the first time they've been possible in an all-electronic device that looks and feels much more like a regular microprocessor."

Working with a group of theoretical physicists led by Steven Girvin, the Eugene Higgins Professor of Physics & Applied Physics, the team manufactured two artificial atoms, or qubits ("quantum bits"). While each qubit is actually made up of a billion aluminum atoms, it acts like a single atom that can occupy two different energy states. These states are akin to the "1" and "0" or "on" and "off" states of regular bits employed by conventional computers. Because of the counterintuitive laws of quantum mechanics, however, scientists can effectively place qubits in a "superposition" of multiple states at the same time, allowing for greater information storage and processing power.

For example, imagine having four phone numbers, including one for a friend, but not knowing which number belonged to that friend. You would typically have to try two to three numbers before you dialed the right one. A quantum processor, on the other hand, can find the right number in only one try.

"Instead of having to place a phone call to one number, then another number, you use quantum mechanics to speed up the process," Schoelkopf said. "It's like being able to place one phone call that simultaneously tests all four numbers, but only goes through to the right one."

These sorts of computations, though simple, have not been possible using solid-state qubits until now, in part because scientists could not get the qubits to last long enough. While the first qubits of a decade ago were able to maintain specific quantum states for about a nanosecond, Schoelkopf and his team are now able to maintain theirs for a microsecond—a thousand times longer, which is enough to run the simple algorithms.

To perform their operations, the qubits communicate with one another using a "quantum bus"—photons that transmit information through wires connecting the qubits—previously developed by the Yale group. The key that made the two-qubit processor possible was getting the qubits to switch "on" and "off" abruptly, so that they exchanged information quickly and only when the researchers wanted them to, said Leonardo DiCarlo, a postdoctoral associate in applied physics at Yale's School of Engineering & Applied Science and lead author of the paper.

Next, the team will work to increase the amount of time the qubits maintain their quantum states so they can run more complex algorithms. They will also work to connect more qubits to the quantum bus.
The processing power increases exponentially with each qubit added, Schoelkopf said, so the potential for more advanced quantum computing is enormous. But he cautions that it will still be some time before quantum computers are used to solve complex problems. "We're still far away from building a practical quantum computer, but this is a major step forward."

Authors of the paper include Leonardo DiCarlo, Jerry M. Chow, Lev S. Bishop, Blake Johnson, David Schuster, Luigi Frunzio, Steven Girvin and Robert Schoelkopf (all of Yale University), Jay M. Gambetta (University of Waterloo), Johannes Majer (Atominstitut der Österreichischen Universitäten) and Alexandre Blais (Université de Sherbrooke).

Article source: ScienceDaily.com
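The phone-number analogy above is essentially Grover's search on two qubits: with four possibilities, a single oracle query plus one reflection identifies the marked item with certainty. The sketch below is a generic statevector illustration of that textbook algorithm, not a model of the Yale processor itself.

```python
import numpy as np

def grover_two_qubits(marked):
    """One Grover iteration over 4 items; returns the outcome probabilities."""
    n = 4
    state = np.full(n, 1 / np.sqrt(n))                  # uniform superposition over 00,01,10,11
    oracle = np.eye(n)
    oracle[marked, marked] = -1                         # flip the sign of the marked item
    diffusion = 2 * np.full((n, n), 1 / n) - np.eye(n)  # reflection about the average amplitude
    state = diffusion @ (oracle @ state)
    return np.round(state ** 2, 6)

for marked in range(4):
    print(marked, grover_two_qubits(marked))            # probability 1.0 lands on the marked index
```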
We have been delving into the dirty secret behind our food, which is that it comes from bacteria, primarily, with considerable assistance from a social network of fungi, nematodes, micro-arthropods and soil-dwelling microbes of various descriptions, many of which make the Star Wars café scene characters seem tame.

Most people, asked what plants eat, answer something like, "sunlight, water and dirt." Water and sunlight play an important role, for sure. Using the energy of photons from the sun, sugars and carbohydrates are constructed from carbon dioxide and water, discarding oxygen. But the real denizens of the deep are bacteria. Thanks to O2-generating bacteria at work for a billion years, Earth is now habitable for oxygen-loving creatures such as ourselves.

In general terms, the strategy for solar energy utilization in all organisms that contain chlorophyll or bacteriochlorophyll is the same. Here is how some of our ancestors, the purple bacteria, do it:
- Light energy is captured by pigment molecules in the light harvesting or "antenna" region of the photosystem, and is stored temporarily as an excited electronic state of the pigment.
- Excited state energy is channeled to the reaction center region of the photosystem, a pigment-protein complex embedded in a charge-impermeable lipid bilayer membrane.
- Arrival of the excited state energy at a particular bacteriochlorophyll (BChl), or pair of BChls, in the reaction center triggers a photochemical reaction that separates a positive and negative charge across the width of the membrane.
- Charge separation initiates a series of electron transfer reactions that are coupled to the translocation of protons across the membrane, generating an electrochemical proton gradient [protonmotive force (pmf)] that can be used to power reactions such as the synthesis of ATP.

If your eyes glazed over at that explanation, don't worry. Much of photosynthesis still remains a mystery. Over the past several decades scientists examining oxygenic bacteria known as prochlorophytes (or oxychlorobacteria) have discovered a light harvesting protein complex. The intriguing thought arises, given how much of the bodies of plants are actually made up of bacteria (as also are our own), of whether photosynthesis is actually dependent on bacteria at one or more of the steps in the process.

Recently Drs. Jianshu Cao, Robert Sibley and three MIT graduate students studied purple bacteria, one of the planet's oldest species, and discovered a special symmetry. Ring-shaped molecules are arranged in a peculiarly faceted pattern on the spherical photosynthetic membrane of the bacterium. Dr. Cao says, "We believe that nature found the most robust structures in terms of energy transfer." Only a lattice made up of nine-fold symmetric complexes can tolerate an error in either direction.

Spinning Photon Nets

Another discovery (by Sabbert et al. in 1996) is that in order to optimize sunlight, the nine-fold symmetric lattice has to spin. Moreover, it has to spin quite fast — nearly 100 rpm. We know of some bacterial flagella that spin at high rpm. Might spinning flagella propel the photon-capturing process? Too soon to say, but it's an intriguing idea, and yet more evidence for quantum entanglement of all life, big and small.

The Encyclopedia of Applied Physics (1995) says: The amount of CO2 removed from the atmosphere each year by oxygenic photosynthetic organisms is massive. It is estimated that photosynthetic organisms remove 100 x 10^15 grams of carbon (C)/year.
This is equivalent to 4 x 10^18 kJ of free energy stored in reduced carbon, which is roughly 0.1% of the visible radiant energy incident on the earth/year. Each year the photosynthetically reduced carbon is oxidized, either by living organisms for their survival, or by combustion. The result is that more CO2 is released into the atmosphere from the biota than is taken up by photosynthesis. The amount of carbon released by the biota is estimated to be 1-2 x 10^15 grams of carbon/year. Added to this is carbon released by the burning of fossil fuels, which amounts to 5 x 10^15 grams of carbon/year. The oceans mitigate this increase by acting as a sink for atmospheric CO2. It is estimated that the oceans remove about 2 x 10^15 grams of carbon/year from the atmosphere. This carbon is eventually stored on the ocean floor. Although these estimates of sources and sinks are uncertain, the net global CO2 concentration is increasing. Direct measurements show that each year the atmospheric carbon content is currently increasing by about 3 x 10^15 grams. … Based on predicted fossil fuel use and land management, it is estimated that the amount of CO2 in the atmosphere will reach 700 ppm within [this] century. (references omitted)

What needs to happen, quickly, to reverse our rush to a climate from which there can be no near-term recovery, and to avoid Earth becoming as uninhabitable as Venus, is to accelerate photosynthesis while decelerating carbon emissions. Our allies in this are bacteria and fungi, as they were billions of years ago. They will do the heavy lifting if we just give them a little support. They need good growth conditions (like heat and moisture, which we should have in increasing abundance this century), nutrients, and space to breathe. Lose the antibacterial soaps and sprays, please.

Planting gardens and tree crops is a start. Ecological restoration, where damage can be slowly unwound by greenery, is another step. Living roofs, tree-lined hardscapes, earth-sheltered homes: all of these are both adaptive and mitigating strategies for a recovering climate stasis. But there is something even more powerful.

Tea from a Firehose

This week we asked Joey "Mr Tea" Thomas to come dose the Ecovillage Training Center with his eclectic brew of liquid compost. Mr Tea's recipe is as good as any batch of Biodynamic Preps or EM (Effective Micro-organisms) you might already be using. It is inestimably superior to MiracleGrow® or other commercial, bagged soil amendments. In a large stainless steel tank retrofitted with aerating pipes, Mr Tea combines de-chlorinated warm water and…
- Folic Acid
- Fish Oil Emulsion
- Bat Guano
- Feather Meal
- Virgin Forest Soil
- Deep Pasture Topsoil
- Composted Animal Manure
- Composted Kitchen Scraps
- Composted Poultry Litter
- Worm Castings & Liquor, and
- Kelp, Humates, and Biochar

The kelp, fish oil, and most of the composts provide rich food for the microbes while they brew. The humates are million-year-old deposits with diverse paleobacteria. The bat guano is drawn from distant caves rich in trace minerals and packed with still more varieties of exotic bacteria. The two kinds of soil contain a complex of two discrete living microbiomes, one the fungally-rich virgin forest and the other a bacterially dominated grasslands. The fine biochar particulates provide enough soil structure to retain water – about 10 times the volume of the biochar itself – and aerobic conditions, while providing a coral reef-like microbial habitat.
The animal manures, worm castings, feather meal and compostables all contribute to the biodiversity of available microfauna. In the world of bacterial epigenetics, dictated by the particular demands of diverse members of the web in different seasons and weather conditions, this is a supermarket of genotypes that allows the bacteria to switch up and morph into whatever might be needed for soil health and fertility, capturing passing genes and unlocking regions of their DNA and RNA to provide new or ancient solutions to current conditions.

Bandwidth permitting, you can watch this video that's so sexy it should be x-rated. This is a revolution disguised as organic gardening. The sex is going on right in front of the camera, you'd just need a microscope to see it. Use your imagination.

If we want to stop global climate change while still surviving unpredictable and changing weather patterns, we'll need to hold more water, nutrients and carbon in the soil. We can do that with a good diversity of healthy microorganisms and their byproducts. We're trying to increase the retention time of carbon in its solid form in the land for as long as possible, as opposed to allowing it to become gaseous, because that's when it becomes dangerous to our future. That is what climate farming, or what my friend Darren Doherty calls regrarianism, is all about. It's about improving the soil to heal the atmosphere. As we say in the clip, this is agriculture that builds rather than mines the soil and can transform our beloved home back into a garden planet.
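To put the flux figures quoted earlier from the Encyclopedia of Applied Physics in one place, here is a minimal back-of-the-envelope budget (Python). The numbers are the ones quoted above, in units of 10^15 grams of carbon per year, and the text itself notes that the source and sink estimates are uncertain.

```python
biota_net_release = 1.5   # biota releases ~1-2 x 10^15 g C/yr more than photosynthesis fixes
fossil_fuels      = 5.0   # burning of fossil fuels
ocean_uptake      = 2.0   # net ocean sink

net_gain = biota_net_release + fossil_fuels - ocean_uptake
print(f"implied atmospheric gain ~ {net_gain:.1f} x 10^15 g C/yr "
      "(direct measurements quoted above give ~3)")
```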
With the University of Michigan's latest production of a quantum chip, it's another step forward for quantum computers that will someday dwarf the abilities of today's machines. Working with individual ions or atoms – much smaller than the transistors of even the most advanced microchips – quantum computers may be both more powerful and more compact than existing computers by various orders of magnitude.

Common computers today are thousands of times more powerful and more compact than the first 30-ton behemoths, but they use virtually the same logic. The fundamental design has gone unchanged for 50 years. Quantum computing is a whole new ball game.

The secret lies in the almost magical property of quantum matter to adopt two states simultaneously. Normal integrated circuits store data using transistors, which have just two states – on and off. Each quantum circuit, or qubit, can represent at least three states: on, off, or both at once, by an effect called quantum superposition. This means much more data can be stored on each individual circuit. Actually, qubits can potentially contain many states. Dr Andrew White, Senior Lecturer in Physics at the University of Queensland, describes a qubit like this: "A quantum computer takes that on or off state and adds many different possible states. The first thing, if you think of the globe, let the South Pole be on, the North Pole off – that's not a very good description of the globe. A quantum computer lets you describe information by saying, look, you can take an arrow from Earth's center and point it at the North Pole, South Pole or Los Angeles or London, and that's a richer description. You can fit much more information on a single qubit."

Based on Dr. White's description, a single qubit could replace a whole bank of conventional memory. Normal memory holds a large array of binary numbers expressed as on or off transistors – ones or zeros. Many transistors are needed to express anything more than just a simple number – hence today's computers' need for large memories. For example: you need 8 bits, plus one bit for error correction, to store the binary number for 255, which is expressed as 11111111. Going back to our globe example, our arrow could point to Amsterdam, which could represent 255 – or any other number. A single qubit could store more information than thousands of transistors.

This compact storage leads to another advantage: speed. Without the need to access many memory locations to read data, retrieval is almost instantaneous. Quantum computers will represent a huge leap in processing power as well – they could execute instructions exponentially faster because there would be almost no limit to the size of the instruction. Currently, most computers use 32 or 64 bit instructions.

There is another exciting benefit to working with quantum reactions: entanglement. It describes the ability of quantum matter to "link" two particles. Change one particle and the other changes – instantaneously, even though there is no physical connection! And distance may be irrelevant! This property – not fully understood – would enable computers to talk to each other with no time lag over long distances. Anton Zeilinger at the Institute of Experimental Physics in Vienna, Austria, performed an experiment to demonstrate entanglement: their group strung an optical-fiber cable in a sewer tunnel under the Danube River with an "entangled" photon at each end.
They measured the state of polarization of one photon (horizontal, vertical, etc.), establishing that the other photon immediately had an identical polarization.

What will be the difference to normal computer users? Try instant access to any type of data – whether it is in your computer or on the other side of the planet. As for processing power, few users ever exceed the abilities of today's computers. Much computer hardware is used to generate the fancy graphical interface we call Windows – with plenty left over in reserve. Those not familiar with computer science are often surprised to learn there are still a few applications that cannot run easily on today's computers. They lack sufficient processing power to do climate modeling, artificial intelligence or break strong encryption.

The NSA (National Security Agency) would love to be able to break many a foreign power's encrypted communications, but has been stymied by the lack of a sufficiently fast computer for the job. Experts estimate it would take more than the lifetime of the Universe using all the computers in the world to break a 1024-bit encryption key – the current standard for serious encryption applications. It's worth noting that most commercial encryption only uses a 40-bit key. A quantum computer has the potential to break any encryption in a few days.

Scientists who study global warming and climate would like to have finer-grained models to be able to predict the weather more effectively and determine the real impact man's activities have on the planet. Current computers, although fast, still take hours or days to produce weather simulations that lack detail. Artificial intelligence is another field that could use the extra processing power. Current algorithms simply can't be processed fast enough and, admittedly, may need more refining. However, a quantum computer could theoretically contain more processing power than the human brain in a smaller space – making true AI possible. In fact, more powerful computers often come along well before a use is found for them. In the future, more uses will be found for quantum machines as their tremendous processing power becomes available.

But having the machine is not enough. All of today's software is based on the silicon technology it runs on. New software is already being written to take advantage of quantum computation. One of the most important steps is to write software for error checking. All computers use some type of system to make sure a bit hasn't accidentally "flopped" from a one to a zero. Quantum computer components, because of their atomic size, will be very susceptible to errors. In fact, one of the biggest problems faced by the scientists working on quantum computing is the problem associated with checking the state of an object so small. How does one check the value of a qubit without changing it? Error checking will be of critical importance, and computer scientists have already developed some ideas to ensure accuracy in quantum systems.

They have also already developed algorithms and equipment for super-strong quantum encryption designed to allow hacker-proof security for communications. The National Security Agency and Federal Reserve banks can now buy a quantum cryptographic system from several companies. Anyone who intercepts and tries to read the stream of photons used will disturb the photons in a way that is detectable to both sender and receiver.
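Why an eavesdropper is detectable can be shown with a toy intercept-resend simulation on a BB84-style link: reading the photons in a randomly chosen basis and resending them introduces roughly 25% errors in the positions where sender and receiver happened to use the same basis. The code below is a generic textbook sketch, not the protocol of any particular commercial product.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20_000

alice_bits  = rng.integers(0, 2, N)
alice_basis = rng.integers(0, 2, N)      # 0 = rectilinear, 1 = diagonal
bob_basis   = rng.integers(0, 2, N)

def measure(bits, prep_basis, meas_basis):
    """If the bases match the bit comes through; otherwise the result is random."""
    random_bits = rng.integers(0, 2, bits.size)
    return np.where(prep_basis == meas_basis, bits, random_bits)

# Eve intercepts every photon in a random basis and resends what she measured.
eve_basis = rng.integers(0, 2, N)
eve_bits  = measure(alice_bits, alice_basis, eve_basis)
bob_bits  = measure(eve_bits, eve_basis, bob_basis)

sifted = alice_basis == bob_basis        # keep only the matching-basis positions
qber = np.mean(alice_bits[sifted] != bob_bits[sifted])
print(f"error rate on the sifted key with an eavesdropper present: {qber:.1%} (about 25% expected)")
```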
Quantum encryption represents the first major commercial implementation of what has become known as quantum information science – a blending of quantum mechanics and information theory.

As for the software you use in day-to-day computing, no changes will be necessary. Just as software emulators permit Apple users to run Windows and Windows software on the Mac's PowerPC processor – albeit sacrificing some speed – an emulator could quite easily run any of today's programs at speeds that make today's fastest processors look frozen. So you won't need to run out and buy Microsoft Office 2030 for Quantum Computers – although Bill Gates, if he's still alive, might like that.

It may also change the way we do computing. As in times past when computers were very expensive, we may share a large, centralized quantum computer – one that has the capacity to handle quadrillions of transactions. Connections would be via fiber optics, and personal data – a whole lifetime's worth – could be stored on a quantum USB-type memory the size of a credit card. This would eliminate the need to have millions of PCs that require upgrading every few years.

Don't expect any of this to happen tomorrow. Scientists are still struggling with some tough problems. Which is the best material from which to make quantum systems? How does one check qubit values without losing the information at the same time? What mechanisms are involved in entanglement? Some experts predict it will be 20 years before we see the first fully functional computers that use quantum materials.

No matter how long it takes, money will continue to flow into research efforts. Silicon-based processors are beginning to near the physical limit of smallness and speed. Intel's best processors are currently fabricated using a .15 micron process and run at 3 GHz. One day we may have more processing power than we know what to do with. It will be up to our imaginations – something no computer may ever accurately match – to think of new problems for these enormously powerful machines to solve.

by Philip Dunn, Copyright 2005 PhysOrg.com
Quantum teleportation, or entanglement-assisted teleportation, is a technique used to transfer quantum information from one quantum system to another. It does not transport the system itself, nor does it allow communication of information at superluminal (faster than light) speed. Neither does it concern rearranging the particles of a macroscopic object to copy the form of another object. Its distinguishing feature is that it can transmit the information present in a quantum superposition, useful for quantum communication and computation.

More precisely, quantum teleportation is a quantum protocol by which a qubit a (the basic unit of quantum information) can be transmitted exactly (in principle) from one location to another. The prerequisites are a conventional communication channel capable of transmitting two classical bits (i.e. one of four states), and an entangled pair (b,c) of qubits, with b at the origin and c at the destination. (So whereas b and c are intimately related, a is entirely independent of them other than being initially colocated with b.) The protocol has three steps: measure a and b jointly to yield two classical bits; transmit the two bits to the other end of the channel (the only potentially time-consuming step, due to speed-of-light considerations); and use the two bits to select one of four ways of recovering c. The upshot of this protocol is to permute the original arrangement ((a,b),c) to ((b′,c′),a), that is, a moves to where c was and the previously separated qubits of the Bell pair turn into a new Bell pair (b′,c′) at the origin.

Suppose Alice has a qubit in some arbitrary quantum state |ψ⟩. (A qubit may be represented as a superposition of two basis states, labeled |0⟩ and |1⟩.) Assume that this quantum state is not known to Alice and she would like to send this state to Bob. Ostensibly, Alice has the following options:
1. She can physically transport the qubit to Bob.
2. She can try to copy the state and broadcast the copies.
3. She can measure the qubit and send the classical results to Bob, who then attempts to reconstruct the state.

Option 1 is highly undesirable because quantum states are fragile and any perturbation en route would corrupt the state. Option 2 is forbidden by the no-broadcast theorem. Option 3 (classical teleportation) has also been formally shown to be impossible. (See the no teleportation theorem.) This is another way to say that quantum information cannot be measured reliably. Thus, Alice seems to face an impossible problem.

A solution was discovered by Bennett et al. The components of a maximally entangled two-qubit state are distributed to Alice and Bob. The protocol then involves Alice and Bob interacting locally with the qubit(s) in their possession and Alice sending two classical bits to Bob. In the end, the qubit in Bob's possession will be in the desired state.

Assume that Alice and Bob share an entangled pair of qubits AB. That is, Alice has one half, A, and Bob has the other half, B. Let C denote the qubit Alice wishes to transmit to Bob. Alice applies a unitary operation on the qubits AC and measures the result to obtain two classical bits. In this process, the two qubits are destroyed. Bob's qubit, B, now contains information about C; however, the information is somewhat randomized. More specifically, Bob's qubit B is in one of four states uniformly chosen at random and Bob cannot obtain any information about C from his qubit. Alice provides her two measured classical bits, which indicate which of the four states Bob possesses. Bob applies a unitary transformation which depends on the classical bits he obtains from Alice, transforming his qubit into an identical re-creation of the qubit C.

Suppose Alice has a qubit that she wants to teleport to Bob.
This qubit can be written generally as |ψ⟩_C = α|0⟩_C + β|1⟩_C, with complex amplitudes α and β satisfying |α|² + |β|² = 1. The protocol also requires a maximally entangled pair of qubits, for instance the Bell state |Φ⁺⟩_AB = (|0⟩_A|0⟩_B + |1⟩_A|1⟩_B)/√2. Alice takes one of the particles in the pair, and Bob keeps the other one. The subscripts A and B in the entangled state refer to Alice's or Bob's particle. We will assume that Alice and Bob share the entangled state |Φ⁺⟩_AB.

So, Alice has two particles (C, the one she wants to teleport, and A, one of the entangled pair), and Bob has one particle, B. In the total system, the state of these three particles is given by

|ψ⟩_C ⊗ |Φ⁺⟩_AB = (α|0⟩_C + β|1⟩_C) ⊗ (|0⟩_A|0⟩_B + |1⟩_A|1⟩_B)/√2.

Alice will then make a partial measurement in the Bell basis on the two qubits in her possession. To make the result of her measurement clear, we will rewrite the two qubits of Alice in the Bell basis via the following general identities (these can be easily verified):

|0⟩⊗|0⟩ = (|Φ⁺⟩ + |Φ⁻⟩)/√2
|0⟩⊗|1⟩ = (|Ψ⁺⟩ + |Ψ⁻⟩)/√2
|1⟩⊗|0⟩ = (|Ψ⁺⟩ − |Ψ⁻⟩)/√2
|1⟩⊗|1⟩ = (|Φ⁺⟩ − |Φ⁻⟩)/√2

where the four Bell states are |Φ±⟩ = (|00⟩ ± |11⟩)/√2 and |Ψ±⟩ = (|01⟩ ± |10⟩)/√2.

The three particle state shown above thus becomes the following four-term superposition:

(1/2) [ |Φ⁺⟩_CA ⊗ (α|0⟩_B + β|1⟩_B) + |Φ⁻⟩_CA ⊗ (α|0⟩_B − β|1⟩_B) + |Ψ⁺⟩_CA ⊗ (β|0⟩_B + α|1⟩_B) + |Ψ⁻⟩_CA ⊗ (−β|0⟩_B + α|1⟩_B) ].

Notice all we have done so far is a change of basis on Alice's part of the system. No operation has been performed and the three particles are still in the same state. The actual teleportation starts when Alice measures her two qubits in the Bell basis. Given the above expression, evidently the result of her (local) measurement is that the three-particle state would collapse to one of the following four states (with equal probability of obtaining each):

|Φ⁺⟩_CA ⊗ (α|0⟩_B + β|1⟩_B)
|Φ⁻⟩_CA ⊗ (α|0⟩_B − β|1⟩_B)
|Ψ⁺⟩_CA ⊗ (β|0⟩_B + α|1⟩_B)
|Ψ⁻⟩_CA ⊗ (−β|0⟩_B + α|1⟩_B)

Alice's two particles are now entangled to each other, in one of the four Bell states. The entanglement originally shared between Alice's and Bob's is now broken. Bob's particle takes on one of the four superposition states shown above. Note how Bob's qubit is now in a state that resembles the state to be teleported. The four possible states for Bob's qubit are unitary images of the state to be teleported.

The crucial step, the local measurement done by Alice on the Bell basis, is done. It is clear how to proceed further. Alice now has complete knowledge of the state of the three particles; the result of her Bell measurement tells her which of the four states the system is in. She simply has to send her results to Bob through a classical channel. Two classical bits can communicate which of the four results she obtained.

After Bob receives the message from Alice, he will know which of the four states his particle is in. Using this information, he performs a unitary operation on his particle to transform it to the desired state α|0⟩_B + β|1⟩_B:
- If Alice's message indicates |Φ⁺⟩, Bob's qubit is already in the desired state and he does nothing.
- If the message indicates |Φ⁻⟩, Bob sends his qubit through the gate given by the Pauli matrix σ_z to recover the state.
- If the message indicates |Ψ⁺⟩, Bob applies the gate σ_x to his qubit.
- If the message indicates |Ψ⁻⟩, Bob applies first σ_x and then σ_z to his qubit.

Teleportation is therefore achieved. Experimentally, the projective measurement done by Alice may be achieved via a series of laser pulses directed at the two particles.

In the literature, one might find alternative, but completely equivalent, descriptions of the teleportation protocol given above. Namely, the unitary transformation that is the change of basis (from the standard product basis into the Bell basis) can also be implemented by quantum gates. Direct calculation shows that this gate is given by G = C_N (H ⊗ I), where H is the one-qubit Hadamard gate and C_N is the controlled-NOT gate with the first qubit as control; in matrix form,

G = (1/√2) [ 1 0 1 0 ; 0 1 0 1 ; 0 1 0 −1 ; 1 0 −1 0 ].

Entanglement can be applied not just to pure states, but also mixed states, or even the undefined state of an entangled particle. The so-called entanglement swapping is a simple and illustrative example. If Alice has a particle which is entangled with a particle owned by Bob, and Bob teleports it to Carol, then afterwards, Alice's particle is entangled with Carol's. A more symmetric way to describe the situation is the following: Alice has one particle, Bob two, and Carol one.
Alice's particle and Bob's first particle are entangled, and so are Bob's second and Carol's particle:

Alice -:-:-:-:-:- Bob1 -:- Bob2 -:-:-:-:-:- Carol

(Here -:-:-:- marks an entangled pair; Bob holds the two middle particles and will measure them jointly.)

Now, if Bob performs a projective measurement on his two particles in the Bell state basis and communicates the results to Carol, as per the teleportation scheme described above, the state of Bob's first particle can be teleported to Carol's. Although Alice and Carol never interacted with each other, their particles are now entangled.

One can imagine how the teleportation scheme given above might be extended to N-state particles, i.e. particles whose states lie in an N-dimensional Hilbert space. The combined system of the three particles now has an N^3-dimensional state space. To teleport, Alice makes a partial measurement on the two particles in her possession in some entangled basis of the N^2-dimensional subsystem. This measurement has N^2 equally probable outcomes, which are then communicated to Bob classically. Bob recovers the desired state by sending his particle through an appropriate unitary gate.

A general teleportation scheme can be described as follows. Three quantum systems are involved. System 1 is the (unknown) state ρ to be teleported by Alice. Systems 2 and 3 are in a maximally entangled state ω and are distributed to Alice and Bob, respectively. The total system is then in the state ρ ⊗ ω, and the whole protocol defines a channel Φ that maps the input state ρ of system 1 to an output state of system 3; it is built from maps of the form Ψ ∘ Tr_12, where Tr_12 is the partial trace operation with respect to systems 1 and 2, and ∘ denotes the composition of maps. This describes the channel in the Schrödinger picture. Taking adjoint maps in the Heisenberg picture, the success condition becomes the requirement that the expectation value of every observable O on Bob's system, evaluated on the output of the channel, equals Tr(ρ O).

The proposed channel Φ can be described more explicitly. To begin teleportation, Alice performs a local measurement on the two subsystems (1 and 2) in her possession. Assume the local measurement has effects F_i, for i = 1, ..., N^2. If the measurement registers the i-th outcome, the overall state collapses (up to normalization) to

(F_i ⊗ Id_3)(ρ ⊗ ω).

Bob then applies a corresponding local operation Ψ_i on system 3. On the combined system, this is described by Id_12 ⊗ Ψ_i, where Id_12 is the identity map on the composite system 1 ⊗ 2. Therefore the channel Φ is defined by

Φ(ρ) = Σ_i (Ψ_i ∘ Tr_12)[(F_i ⊗ Id_3)(ρ ⊗ ω)].

Notice that Φ satisfies the definition of LOCC. As stated above, the teleportation is said to be successful if, for all observables O on Bob's system, the equality

Tr[Φ(ρ) O] = Tr[ρ O]

holds. The left-hand side of the equation is

Σ_i Tr[ (Ψ_i ∘ Tr_12)[(F_i ⊗ Id_3)(ρ ⊗ ω)] O ] = Σ_i Tr[ (ρ ⊗ ω)(F_i ⊗ Ψ_i*(O)) ],

where Ψ_i* is the adjoint of Ψ_i in the Heisenberg picture. Assuming all objects are finite dimensional, the success criterion for teleportation therefore takes the form

Σ_i Tr[ (ρ ⊗ ω)(F_i ⊗ Ψ_i*(O)) ] = Tr(ρ O) for all observables O on Bob's system.
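The three-qubit protocol described above can be checked numerically with a short state-vector simulation. The sketch below is a generic illustration written for this purpose (it is not code from any of the sources discussed here): it prepares a random qubit C, shares a Bell pair between A and B, realizes the Bell measurement as a CNOT followed by a Hadamard, and applies Bob's conditional X and Z corrections.

import numpy as np

# Illustrative state-vector simulation of single-qubit teleportation.
# Qubit order is (C, A, B); basis index = 4*c + 2*a + b.

rng = np.random.default_rng(42)

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def tensor(*ops):
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

# Random state |psi> = alpha|0> + beta|1> that Alice wants to teleport
alpha, beta = rng.normal(size=2) + 1j * rng.normal(size=2)
norm = np.sqrt(abs(alpha) ** 2 + abs(beta) ** 2)
alpha, beta = alpha / norm, beta / norm
psi = np.array([alpha, beta])

# Shared Bell pair on qubits A and B: (|00> + |11>)/sqrt(2)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(psi, bell)                 # full 8-dimensional state (C, A, B)

# CNOT with control C and target A, acting trivially on B
CNOT_CA = np.zeros((8, 8), dtype=complex)
for c in range(2):
    for a in range(2):
        for b in range(2):
            src = 4 * c + 2 * a + b
            dst = 4 * c + 2 * ((a + c) % 2) + b
            CNOT_CA[dst, src] = 1

state = tensor(H, I, I) @ (CNOT_CA @ state)  # Bell-measurement basis change

# Measure qubits C and A
probs = np.abs(state) ** 2
outcome = rng.choice(8, p=probs)
m1, m2 = (outcome >> 2) & 1, (outcome >> 1) & 1   # bits measured on C and A

# Collapse onto the measured branch and renormalize
mask = np.array([((i >> 2) & 1) == m1 and ((i >> 1) & 1) == m2 for i in range(8)])
collapsed = np.where(mask, state, 0)
collapsed /= np.linalg.norm(collapsed)

# Bob's qubit before correction (amplitudes of B in the measured branch)
bob = collapsed.reshape(2, 2, 2)[m1, m2, :]

# Classical correction: X if m2 == 1, then Z if m1 == 1
if m2:
    bob = X @ bob
if m1:
    bob = Z @ bob

fidelity = abs(np.vdot(psi, bob)) ** 2
print(m1, m2, round(fidelity, 12))          # fidelity 1.0 for every outcome

Running it repeatedly shows all four (m1, m2) outcomes occurring with roughly equal frequency, exactly as the four-term superposition above predicts, and the recovered state always matches the original.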
<urn:uuid:03071894-88b1-4ee2-bb48-b1c9e5a93f1d>
CC-MAIN-2015-32
http://www.thefullwiki.org/Quantum_teleportation
s3://commoncrawl/crawl-data/CC-MAIN-2015-32/segments/1438042988317.67/warc/CC-MAIN-20150728002308-00232-ip-10-236-191-2.ec2.internal.warc.gz
en
0.93462
2,029
4.03125
4
Photonic chips go 3D Technology Research News The dream of building computer chips that use light signals rather than electricity has entered the realm of serious research in recent years with the advent of photonic crystal, a material that blocks and channels light within extremely small spaces. Producing practical photonic crystal chips, however, includes several challenges: making three-dimensional devices that emit light from specific points, emit at the wavelengths used by today's optical telecommunications equipment and can be manufactured using processes suited to mass production. Research teams from the Massachusetts Institute of Technology and from Kyoto University have made devices that meet all three challenges. The techniques could be used to make smaller, more efficient communications devices; create optical memory and quantum computing and communications devices; develop new types of lasers and biological and chemical sensors; and could ultimately lead to all-optical computer processors. The semiconductor industry took off with the advent of a practical and low-cost method of integrating a large number of transistors into a single chip, said Minghao Qi, a research assistant at MIT. "It is natural then to envision the possibility of integrated photonics, where information is processed fully in the optical domain [at the high] bandwidth of photons," Photonic crystal is usually made from the same semiconductor materials as computer chips using common chipmaking techniques like photolithography. It contains regularly spaced gaps of air or other materials that form boundaries within the crystal that refract, or bend, specific wavelengths of light. Refraction is responsible for the illusion that a drinking straw bends at the air-liquid boundary. Portions of the materials that do not contain gaps channel light within the crystal and emit light from it. The MIT photonic chip has seven layers that each contain two types of two-dimensional photonic crystal. One type is an arrangement of rods surrounded by air and the other type is solid material perforated with air holes. The rod slab is positioned above the hole slab in each layer, and the layers are offset to produce steps. The holes are about 500 nanometers in diameter, or about one-tenth the size of a red blood cell. The material blocks light at wavelengths of 1.3, 1.4 and 1.5 microns. Telecommunications systems use near-infrared 1.3- and 1.55-micron wavelengths. The researchers filled specific air holes and gaps between rods during the manufacturing process to create solid areas, or defects, that emit light. "A critical goal in photonic crystal [research] is the ability to put arbitrary defects with precisely controlled shapes and sizes at designed locations," said Qi. The two types of two-dimensional photonic crystal in each layer of the three-dimensional crystal also allow for polarization control, said Qi. A light beam's electric field is ordinarily oriented in a plane perpendicular to the beam. The electric field of polarized light is confined to one direction within the plane. Controlling polarization is important because transferring light signals from photonic crystal to optical fibers requires matching the polarizations of the devices, he said. The crystal is more efficient than previous three-dimensional photonic crystals, and the seven layers can be formed in four processing steps, said Qi. 
The Kyoto University team has advanced its existing woodpile-structured three-dimensional photonic crystal with a method to make solid areas in specific locations and has shown that the material precisely controlled light, said Susumu Noda, a professor of electronic science and engineering at Kyoto University. The woodpile photonic crystal consists of perpendicular layers of semiconductor rods. The researchers' design calls for 200-nanometer-wide rods spaced 700 nanometers center to center. The researchers also sandwiched a light source inside their photonic crystal, which is a step toward fully integrated optical devices, said Noda. The MIT process could be used to make practical telecommunications devices and biological and chemical sensors in two to three years, said Qi. High-quality devices that could be coupled to optical fiber could take five years, he said. Simple all-optical computer chips could take 10 years to develop, he said. Devices based on the Kyoto method could become practical in five to ten years, said Noda. Qi's research colleagues were Elefterios Lidorikis, Peter Rakich, Stephen Johnson, John Joannopoulos, Erich Ippen and Henry Smith. The work appeared in the June 3, 2004 issue of Nature. The research was funded by the National Science Foundation (NSF). Noda's research colleagues were Shinpei Ogawa, Masahiro Imada, Susumu Yoshimoto and Makoto Okano. The work appeared in the June 3, 2004 issue of Sciencexpress. The research was funded by Core Research for Evolution Science and Technology (CREST), Japan Science and Technology Agency (JST), and the Ministry of Education, Culture, Sports, Science and Technology (MEXT) of Japan. Timeline: 2-3 years, 5 years, 7-8 years; Funding: Government, Corporate. TRN Categories: Optical Computing, Optoelectronics and Photonics; Materials Science and Engineering. Story Type: News. Related Elements: Technical paper, "Control of Light Emission by 3D Photonic Crystals", Sciencexpress, June 3, 2004; technical paper, "A Three-dimensional Optical Photonic Crystal with Design Point Defects," Nature, June 3, 2004.
<urn:uuid:8b809e4a-dfaf-4232-a0b5-4c49ac90acea>
CC-MAIN-2015-32
http://www.trnmag.com/Stories/2004/072804/Photonic_chips_go_3D_072804.html
s3://commoncrawl/crawl-data/CC-MAIN-2015-32/segments/1438042988051.33/warc/CC-MAIN-20150728002308-00326-ip-10-236-191-2.ec2.internal.warc.gz
en
0.89408
1,273
3.828125
4
RSA was first described in 1977 by Ron Rivest, Adi Shamir and Leonard Adleman of the Massachusetts Institute of Technology. Public-key cryptography, also known as asymmetric cryptography, uses two different but mathematically linked keys, one public and one private. The public key can be shared with everyone, whereas the private key must be kept secret. In RSA cryptography, both the public and the private keys can encrypt a message; the opposite key from the one used to encrypt a message is used to decrypt it. This attribute is one reason why RSA has become the most widely used asymmetric algorithm: It provides a method of assuring the confidentiality, integrity, authenticity and non-repudiation of electronic communications and data storage. Many protocols like SSH, OpenPGP, S/MIME, and SSL/TLS rely on RSA for encryption and digital signature functions. It is also used in software programs -- browsers are an obvious example; they need to establish a secure connection over an insecure network like the Internet or validate a digital signature. RSA signature verification is one of the most commonly performed operations in IT.

Explaining RSA's popularity
RSA derives its security from the difficulty of factoring large integers that are the product of two large prime numbers. Multiplying these two numbers is easy, but determining the original prime numbers from the total -- factoring -- is considered infeasible due to the time it would take even using today's supercomputers. The public and the private key-generation algorithm is the most complex part of RSA cryptography. Two large prime numbers, p and q, are generated using the Rabin-Miller primality test algorithm. A modulus n is calculated by multiplying p and q. This number is used by both the public and private keys and provides the link between them. Its length, usually expressed in bits, is called the key length. The public key consists of the modulus n, and a public exponent, e, which is normally set at 65537, as it's a prime number that is not too large. The e figure doesn't have to be a secretly selected prime number as the public key is shared with everyone. The private key consists of the modulus n and the private exponent d, which is calculated using the Extended Euclidean algorithm to find the multiplicative inverse with respect to the totient of n.

A simple, worked example
Alice generates her RSA keys by selecting two primes: p=11 and q=13. The modulus n=p×q=143. The totient of n is ϕ(n)=(p−1)×(q−1)=120. She chooses 7 for her RSA public key e and calculates her RSA private key using the Extended Euclidean Algorithm, which gives her 103. Bob wants to send Alice an encrypted message M, so he obtains her RSA public key (n, e), which in this example is (143, 7). His plaintext message is just the number 9 and is encrypted into ciphertext C as follows: M^e mod n = 9^7 mod 143 = 48 = C. When Alice receives Bob's message she decrypts it by using her RSA private key (d, n) as follows: C^d mod n = 48^103 mod 143 = 9 = M. To use RSA keys to digitally sign a message, Alice would create a hash or message digest of her message to Bob, encrypt the hash value with her RSA private key and add it to the message. Bob can then verify that the message has been sent by Alice and has not been altered by decrypting the hash value with her public key. If this value matches the hash of the original message, then only Alice could have sent it (authentication and non-repudiation) and the message is exactly as she wrote it (integrity).
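The toy numbers above are small enough to check directly. The following Python sketch reproduces the worked example (p=11, q=13, e=7) and adds a miniature signing step; the helper names and the stand-in digest value are our own, and real systems use vetted libraries with proper padding rather than raw textbook RSA.

# Minimal "textbook" RSA using the article's toy numbers.
# Illustration only -- never use raw RSA like this in production.

def egcd(a, b):
    # Extended Euclidean algorithm: returns (g, x, y) with a*x + b*y = g
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    # Multiplicative inverse of a modulo m
    g, x, _ = egcd(a, m)
    if g != 1:
        raise ValueError("inverse does not exist")
    return x % m

p, q, e = 11, 13, 7
n = p * q                      # 143, the modulus
phi = (p - 1) * (q - 1)        # 120, the totient
d = modinv(e, phi)             # 103, the private exponent

M = 9                          # Bob's plaintext "message"
C = pow(M, e, n)               # encryption: 9^7 mod 143 = 48
M_out = pow(C, d, n)           # decryption: 48^103 mod 143 = 9

print(n, phi, d)               # 143 120 103
print(C, M_out)                # 48 9

# Toy "signature": Alice signs with her private key, Bob verifies with
# her public key (a real scheme would sign a cryptographic hash).
digest = 5                     # stand-in for a hash value
signature = pow(digest, d, n)
assert pow(signature, e, n) == digest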
Alice could, of course, encrypt her message with Bob's RSA public key (confidentiality) before sending it to Bob. A digital certificate contains information that identifies the certificate's owner and also contains the owner's public key. Certificates are signed by the certificate authority that issues them, and can simplify the process of obtaining public keys and verifying the owner.

Security of RSA
As discussed, the security of RSA relies on the computational difficulty of factoring large integers. As computing power increases and more efficient factoring algorithms are discovered, the ability to factor larger and larger numbers also increases. Encryption strength is directly tied to key size, and doubling key length delivers an exponential increase in strength, although it does impair performance. RSA keys are typically 1024 or 2048 bits long, but experts believe that 1024-bit keys could be broken in the near future, which is why government and industry are moving to a minimum key length of 2048 bits. Barring an unforeseen breakthrough in quantum computing, it should be many years before longer keys are required, but elliptic curve cryptography is gaining favor with many security experts as an alternative to RSA for implementing public-key cryptography. It can create faster, smaller and more efficient cryptographic keys. Much of today's hardware and software is ECC-ready and its popularity is likely to grow, as it can deliver equivalent security with lower computing power and battery resource usage, making it more suitable for mobile apps than RSA. Finally, a team of researchers that included Adi Shamir, a co-inventor of RSA, has successfully determined a 4096-bit RSA key using acoustic cryptanalysis; however, any encryption algorithm is vulnerable to this type of attack. The inventors of the RSA algorithm founded RSA Data Security in 1983. The company was later acquired by Security Dynamics, which was in turn purchased by EMC Corporation in 2006. The RSA algorithm was released to the public domain by RSA Security in 2000.
<urn:uuid:87d1f7b9-7958-4fb9-9593-8690dec0a6bf>
CC-MAIN-2015-32
http://searchsecurity.techtarget.com/definition/RSA
s3://commoncrawl/crawl-data/CC-MAIN-2015-32/segments/1438042988650.6/warc/CC-MAIN-20150728002308-00185-ip-10-236-191-2.ec2.internal.warc.gz
en
0.944887
1,258
4.28125
4
String theory was originally developed to try and describe the fundamental particles and forces that make up our universe. Over the last 25 years, string theory has become some physicists' contender for a 'theory of everything', reconciling particle physics with cosmology - a puzzle that tormented Einstein for the last 30 years of his life. It contends that the subatomic particles found in nature, such as electrons and quarks, may not be particles at all but instead tiny vibrating strings. String theorists said our universe is 10-dimensional but during the big bang, 6 of those 10 dimensions curled up into a tiny ball and the remaining '4' (they count time as a dimension even though it relies on the other three dimensions) expanded explosively, providing us with the universe we know and love, including the cast of "Jersey Shore". How did these six dimensions compactify? There's no mathematical basis for topology and properties of these higher-dimensional universes. Where do strings come from? No one knew so what we ended up with were multiple 'string theories', which means it stands a chance of not being a theory at all. Some even proposed M-theory (11-dimensions) to get away from focusing on strings entirely.(1) There's no shortage of instances where theory, deduction or inference have survived being falsifiable just fine and later been proven to be correct but in a modern science world a half dozen 'theories of a theory' won't get much traction outside people who want funding. An upcoming article in Physical Review Letters says it can change all that and make string theory experimental. Their reasoning? They say string theory seems to predict the behavior of entangled quantum particles and since that prediction can be tested in the laboratory, they can now test string theory - predicting how entangled quantum particles behave provides the first opportunity to test string theory by experiment because quantum entanglement can be measured in the lab.(2) There is no obvious connection to explain why a theory that is being developed to describe the fundamental workings of our universe is useful for predicting the behavior of entangled quantum systems but if it checks out, it will be an interesting insight. "This will not be proof that string theory is the right 'theory of everything' that is being sought by cosmologists and particle physicists. However, it will be very important to theoreticians because it will demonstrate whether or not string theory works, even if its application is in an unexpected and unrelated area of physics," says professor Mike Duff, lead author of the study from the Department of Theoretical Physics at Imperial College London. "If experiments prove that our predictions about quantum entanglement are correct, this will demonstrate that string theory 'works' to predict the behaviour of entangled quantum systems. "This may be telling us something very deep about the world we live in, or it may be no more than a quirky coincidence. Either way, it's useful." Article: M. J. Duff , L. Borsten, D. Dahanayke , W. Rubens, A. Marrani, 'Four-qubit entanglement from string theory', arXiv:1005.4915v2 and Physical Review Letters 2010 (in press) (1) String theory String theory, and its extension M-theory, are mathematical descriptions of the universe. They have been developed, over the last 25 years, by theoreticians seeking to reconcile the theories of general relativity and quantum mechanics. 
(The former describes the universe at the level of cosmology – the very large, while the latter describes the universe at the level of particle physics – the incredibly small). One of the major bugbears, especially of M-theory, is that it describes billions of different universes and ‘anything’ can be accommodated in one or other of the M-theory universes. Researchers have no way of testing which of the answers that string/M-theory gives us is ‘right’. Indeed, they all may be right and we live in one universe among an infinite number of universes. So far no one has been able to make a prediction, using string theory, that can be tested to see if it is correct or not. (2) Qubit (quantum bit) entanglement Under very precisely controlled conditions it is possible to entangle the properties of two quantum particles (two quantum bits, or qubits), for example two photons. If you then measure the state of one of these entangled particles, you immediately affect the state of its partner. And this is true if the particles are close to one another or separated by enormous distance. Hence Einstein’s apposite description of quantum entanglement as ‘spooky action at a distance’. It is possible to entangle more than two qubits, but calculating how the particles are entangled with one another becomes increasingly complex as more particles are included. Duff and colleagues say they realized that the mathematical description of the pattern of entanglement between three qubits resembles the mathematical description, in string theory, of a particular class of black holes. Thus, by combining their knowledge of two of the strangest phenomena in the universe, black holes and quantum entanglement, they realized they could use string theory to produce a prediction that could be tested. Using the string theory mathematics that describes black holes, they predicted the pattern of entanglement that will occur when four qubits are entangled with one another. (The answer to this problem has not been calculated before.) Although it is technically difficult to do, the pattern of entanglement between four entangled qubits could be measured in the laboratory and the accuracy of this prediction tested.
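As a concrete illustration of the two-qubit entanglement described in note (2), the short simulation below prepares the Bell state (|00⟩ + |11⟩)/√2 and samples joint measurement outcomes; the two bits always agree, even though each individual bit is random. It is a generic textbook sketch, unrelated to the four-qubit calculation reported in the paper.

import numpy as np

# Two-qubit Bell-state correlations: measuring both qubits in the
# computational basis always gives matching results (00 or 11).

rng = np.random.default_rng(1)

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Bell state as a 4-component vector over the basis |00>, |01>, |10>, |11>
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

probs = np.abs(bell) ** 2          # [0.5, 0, 0, 0.5]
outcomes = rng.choice(4, size=10, p=probs)

for o in outcomes:
    a, b = divmod(o, 2)            # bits of the joint outcome
    print(a, b)                    # always equal: 00 or 11, each ~50% of the time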
<urn:uuid:6d291fab-2d1f-4c2b-b451-3e245d3d787e>
CC-MAIN-2015-32
http://www.science20.com/news_articles/string_theory_testing_untestable
s3://commoncrawl/crawl-data/CC-MAIN-2015-32/segments/1438042987155.85/warc/CC-MAIN-20150728002307-00112-ip-10-236-191-2.ec2.internal.warc.gz
en
0.914445
1,449
3.703125
4
SAN JOSE, Calif. IBM's Almaden Research Center unveiled the world's largest quantum computer to date a 5-bit computer squeezed onto a single molecule at the Hot Chips conference last week. The five fluorine atoms in the molecule each represent a quantum bit, or "qubit," which made the computer the first ever capable of solving a problem related to code cracking, called the order-finding problem, in a single step. "Every other computer in the world takes several steps to solve the order-finding problem, but our quantum computer solved it in a single step," said Stanford University researcher Lieven Vandersypen. The quantum computer was invented by IBM Almaden Research Center researcher Isaac Chuang, who led a team of scientists that included fellow researchers Gregory Breyta and Costantino Yannoni of IBM Almaden, professor Richard Cleve of the University of Calgary, and researchers Matthias Steffen and Lieven Vandersypen from Stanford University. Long way to go Since the late 1980s Chuang has been pursuing ever-more-sophisticated realizations of quantum computers. His last effort was a 3-qubit machine. While the latest version represents a rapid advance for the field, quantum computing still has a long way to go before it will compete with leading-edge supercomputers. But researchers in the field are optimistic that machines of competitive size will appear in this decade. That optimism was reflected in a statement by IBM's Chuang. "This result gives us a great deal of confidence in understanding how quantum computing can evolve into a future technology," Chuang said. "It reinforces the growing realization that quantum computers may someday be able to live up to their potential of solving, in remarkably short times, problems that are so complex that the most powerful supercomputers couldn't calculate the answers even if they worked on them for millions of years." The order-finding problem determines the period of a function. In digital computers, that requires a step-by-step iterative solution of the function's values until they begin to repeat. The quantum computer, however, solved the order-finding problem without any iteration steps. Its ability to obtain a single-step solution can be traced to the nature of qubits. Because quantum bits simultaneously represent all possible values of the input variables, the single step of a quantum computer considers every possible input value at once. Hence, the single step can solve problems of any size. That represents the ultimate in parallel processing: parallelism at the bit level. While a quantum computation starts and ends with information encoded in binary bits, the computation itself is performed in the mysterious realm of quantum mechanics, where a physical system can be in what is known as a superposition of states. Using nuclear magnetic resonance, it is possible to measure whether the constituent particles of an atom the protons and neutrons in its nucleus are spinning in one direction or another. Such a measurement represents the output stage of the computation, since once observed, an elementary particle's spin becomes fixed in one of its binary states: spin up or spin down. One spin direction is designated "0" and the other designated "1," but both are only probabilities during the course of a computation. IBM's 5-bit quantum computer used the spin configuration in fluorine atoms to represent qubits, but other experiments have employed the spin of carbon or oxygen atoms. 
Logic is performed in any quantum computer when its qubit atoms affect the spin of neighboring qubit atoms. When structured properly, the quantum-computer atom can perform a number of mathematical operations in parallel. In short, quantum operations are the reversible generalizations of standard digital operations. A quantum computer's basic operations are often compared with probabilistic algorithms. Many probabilistic algorithms can be sped up by random search methods. For instance, the occasional multiplying of intermediate results by a random number and checking for performance gains can uncover shortcuts in gradient descent algorithms that would otherwise have to be discovered one by one via exhaustive searches. Such probabilistic circuits can always be transformed into larger, slower deterministic circuits. For the two-element order-finding problem here, a deterministic circuit requires an exponential number of steps (four), while a probabilistic circuit only requires a polynomial number of steps (less than four, but more than one). The quantum circuit required only a single step. Quantum circuits extend the probabilistic notion by generalizing the one-way operations of both deterministic and probabilistic operations into reversible operations. Qubits enable reversible procedures by taking on not only the 1 and 0 symbolized by all bits but also holding a vector length expressed as a complex number. Thus, a qubit encodes both a spin direction expressed as 1 or 0 and a complex amplitude. The resulting qubit vector is angled into a multidimensional space equal in dimension to the number of qubits stored in a system. A 2-bit qubit computer thus can solve, in a single step, problems represented in two-dimensional space. In IBM's 5-qubit quantum computer, only two qubits could be used, since three had to be reserved for calculating the result. Consequently, IBM's 5-qubit quantum computer was only able to solve 2-bit problems. However, future multibit quantum computers should in theory scale up to higher dimensional spaces merely by adding qubits. The biggest problem with quantum computers is that the quantum states must remain unobserved, or "decoherence" will spoil their ability to take on different values simultaneously and therefore disable them from solving problems in a single step. That problem stymied the first attempts to build quantum computers, since any isolated particle, such as an electron or photon, could be too easily disturbed by environmental effects. The first successes at Harvard, MIT and IBM's Almaden Research Center were based on the use of atoms as stable environments for particle states. IBM's answer here, suggested earlier this year in a paper by Cleve, was to tack on a quantum Fourier transform of the results and then decode the answer from its spectrum. A quantum Fourier transform essentially performs a discrete Fourier transform on the amplitude part of the qubit vector, allowing the final quantum state of the computer to be inferred by humans after their examination of the qubit atom's frequency spectrum, rather than through direct observation of the probabilistic state. The specific problem IBM's 5-qubit computer tackled was the ordering problem, a precursor to the integer-factoring algorithm used to decode enciphered data.
Traditional deterministic algorithms must exponentially expand computing resources as problems get bigger, but with a quantum computer it takes the same number of steps to solve any size problem, as long as there are enough qubits to encode the variables and the algorithm. Thus, the CPU's power scales up exponentially with the number of bits in its "registers." That fact, and a practical quantum-based algorithm for factoring large numbers, was first published in 1994 by Peter Shor of AT&T Research. Since encryption plays a central role in national security, Shor's result stimulated a round of government funding in quantum-computer research. Two bits, four ways The order-finding problem used in IBM's demonstration essentially found the period of a 2-bit function. Here, that amounted to repeatedly permuting a 2-bit string with the same function four possible ways for two bits until returning to the original 2-bit value. The number of permutations needed is equal to the period of the "ordering" function. According to a statement by IBM, this is the most complex algorithm yet solved by an actual quantum computer. For the five-qubit quantum computer, a two-element ordering problem was encoded by putting the starting state in the first two qubits. Radio frequency pulses were used to put the five atoms into the correct starting state, then a second set of pulses induced a single permutation of the function in the two "input atoms," and the interactions among the spins caused the answer to be calculated. "In essence, the quantum computer simultaneously answers what happens after one, two, three and four permutations; then we take the quantum Fourier transform to find the period," said Vandersypen. The quantum Fourier transform was measured with standard laboratory magnetic resonance equipment, permitting the results to be read out from the spectrum of three qubit atoms. Once these molecular methods mature, they may actually represent an advance over silicon-circuit fabrication processes. Instead of having to define complex circuits from the ground up using photographically reduced drawings, quantum circuits could be fabricated using automatic controlled chemical reactions. Current research indicates that the use of internal nuclear states is a stable environment for the representation and modification of qubits.
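For contrast with the single-step quantum solution described above, the sketch below shows the order-finding problem as a classical computer has to treat it: apply the function step by step until the starting value recurs. The particular permutation and the modular-arithmetic example are our own illustrations, not the function encoded on the IBM molecule.

# Classical order finding: iterate until the starting value repeats.

def order_of(f, x0):
    """Number of applications of f needed to map x0 back to itself."""
    x, steps = f(x0), 1
    while x != x0:
        x = f(x)
        steps += 1
    return steps

# A permutation of the four possible 2-bit values {0, 1, 2, 3}
perm = {0: 2, 2: 1, 1: 3, 3: 0}
f = lambda x: perm[x]

print(order_of(f, 0))   # 4 -- the period of this permutation starting from 0

# Order finding also underlies Shor's factoring algorithm: the order of
# a modulo N is the smallest r > 0 with a**r % N == 1.
def multiplicative_order(a, N):
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

print(multiplicative_order(2, 15))  # 4, which lets one factor 15 = 3 * 5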
<urn:uuid:0817851d-9735-4969-8819-4707ab5398e6>
CC-MAIN-2015-32
http://www.eetimes.com/document.asp?doc_id=1142044
s3://commoncrawl/crawl-data/CC-MAIN-2015-32/segments/1438042987628.47/warc/CC-MAIN-20150728002307-00056-ip-10-236-191-2.ec2.internal.warc.gz
en
0.928333
1,796
3.5625
4
Story 2 - 22/10/2012

A Wind Tunnel for Quantum Physics

Simulating quantum phenomena on today’s computers can be extremely challenging. Yet, just like the wind tunnel changed the trajectory of modern aviation, new specially built quantum simulators may soon guide the design of tailor-made quantum materials.

[Image caption] Inside the quantum wind tunnel. The individual ions (dots) behave like tiny magnet bars (spins) which can orient themselves according to their neighbors and reproduce very complex quantum phenomena.

In the early days of aeronautics, computers that could simulate the subtle laws of aerodynamics did not exist. For this reason, people built wind tunnels to study the physics of flight on model systems. Quantum physicists are now facing a similar situation where they cannot satisfactorily simulate the behavior of many interesting quantum systems on even the most powerful computers. Scientists have, therefore, been looking for ways to build some physical systems that may be used as quantum simulators, the quantum analog of wind tunnels. This has so far been possible only for relatively simple quantum systems, involving only a few dozens of interacting quantum particles. However, many interesting effects are believed to happen in the presence of a larger number of interacting particles. Joe Britton, at the National Institute of Standards and Technology (NIST) in Boulder, Colorado, USA, and coworkers have now demonstrated rudimentary operation of a quantum simulator with hundreds of simultaneously interacting quantum particles. The NIST machine may well be a stepping-stone on our way towards a rational design of quantum materials. Why simulate? After all, the laws of aerodynamics and quantum physics can be described using mathematical laws. Interestingly, however, these laws lead to very subtle and complex phenomena. On the one hand, air flowing around the wing of an aircraft, for example, frequently leads to rather strong turbulence. On the other hand, interacting quantum particles can depend on each other without exchanging any signals. Both situations can be described mathematically. At the same time, however, they are excruciatingly difficult to reproduce with computer simulations. The turbulent flow of air is a chaotic phenomenon for which tiny inaccuracies of the computer simulation can potentially lead to totally wrong predictions. Interacting quantum particles have many degrees of freedom, and this easily brings even today’s largest supercomputers to their knees. Wind tunnels and quantum simulators are used to reproduce a phenomenon directly in a controlled environment. Instead of mathematically describing the air particles, they use real air streams; instead of writing mathematical models of quantum particles, they use quantum systems such as ions to actually produce an interesting effect. And the wind tunnel or simulator allows the experimenters to easily modify experimental conditions such as wind speeds or interaction strengths. What questions can be answered using these tools? The history of aviation may give us some hints: in the early 1900s even the slightest glitch in the design of an airplane or parachute would often mean the certain death of the pilot.
Yet, in order to advance the times’ understanding of aerodynamics, it was essential to identify what aspects of avian flight were generating the lift: was it the movement of their wings? The shape of their wings? Or their material ? And how could artificial wings that generated enough lift to make humans fly be built? To answer these questions, the Wright brothers and many of their peers studied smaller scale models in very basic wind tunnels. After all, these experiments were far less deadly than jumping off cliffs. And after countless attempts and fails, people had enough knowledge and ingenuity to build the first airplanes. Today’s quantum science and engineering are not dissimilar to the science and engineering of early aviation: how can we sustain specific quantum phenomena? How can we suppress unwanted interference and interaction between our quantum system and the uncontrolled environment? The main reason for the complexity of quantum physics is quantum superposition and entanglement – arguably the most striking difference between classical and quantum physics. A classical binary digit (bit) can represent either the number 0 or the number 1. Therefore, to describe N bits, only N numbers are required. This binary language is the foundation of all of our present day, classical computers. We may think of a classical bit as a coin that is either heads or tails. A quantum bit (qubit) is the quantum analog of a classical bit and can represent any combination of the numbers 0 and 1. And how are the qubits implemented in the present experiment? "The outermost electron of each ion," Britton explains, "acts as a tiny quantum magnet, known as spin , and is utilized as a qubit. Quantum mechanics permits this spin to be in a superposition of states, for example simultaneously oriented parallel to, and antiparallel to, a laboratory magnetic field." Moreover, physically well-separated particles may be tightly interconnected, that is entangled. For example, two qubits can be in a state where both are always measured either in their 0 state or in their 1 state. If we were to add a third particle, its 0 and 1 states could each depend on any combination of the other two particles. This fact leads to an exponential growth in the number of variables: to represent N qubits, approximately 2N numbers are required. This leads to insurmountable problems when trying to simulate quantum systems on today’s classical computers: the amount of memory and the number of computations required are simply too much to handle even for our most powerful supercomputers. A different approach is therefore needed to study large interacting quantum systems. And this is where quantum simulators come in. By offering the ability to recreate interesting quantum effects in a controlled model setup, quantum simulators are expected to boost our understanding and advance engineering. Instead of air flow, quantum simulations often consider quantum bits, called spins. Here, we can think of a spin as a tiny magnet bar that can orient itself arbitrarily in three dimensions: "Previous experiments," Britton explains, "have used only about a dozen interacting spins. The NIST simulator, in contrast, permits controlled interaction of as many as 450 spins." One field where quantum simulators can provide insight is the study of quantum phase transitions. Transitions between classical phases — gas, liquid, solid — are driven by thermal fluctuations. 
Quantum phase transitions, in contrast, are a consequence of quantum fluctuations that are present even at zero temperature. And, as we have seen above, this is the worst combination when it comes to computer simulations. For a quantum simulator, on the other hand, entanglement would simply be a feature of the setup, not a problem that needs to be addressed with tremendous amounts of RAM. NIST’s quantum simulator would be suitable for simulating quantum magnetism, interacting spins arranged on the nodes of a flat grid — scientifically known as quantum Ising interactions on a two-dimensional lattice. "The Ising model," he continues, "describes a simple pair-wise interaction between pairs of spins on a lattice. Among its applications is explaining how weak short-range interactions can give rise to long-range ordering and bulk magnetization. Yet another application is the calculation of phase transitions in magnetic materials (e.g. from paramagnetic to ferromagnetic)." "Britton’s simulator finally gets us closer to having a usable quantum simulator," says Tobias Schätz from the Max Planck Institute of Quantum Optics in Garching, Germany. "Different groups have studied a variety of possibilities to building quantum simulators, and it now seems that trapped ions have allowed us to really get to the next level: that of developing quantum simulators that can simulate systems too complex for even our most powerful computers. Of course, we still have to study these new experiments very well to see how far we can trust their results, but I am very excited about Britton’s results." "Our technological world," Britton concludes, "depends greatly on ‘simple’ quantum devices like the Global Positioning System (GPS) and lasers. What’s needed is simulation support to guide the development of quantum materials such as, for example, high temperature superconductors." A. Niederberger, Visible and Entangled, Opt. Photon. Focus 3 , 7 (2008). A. Friedenauer, H. Schmitz, J. T. Glueckert, D. Porras & T. Schätz, Simulating a quantum magnet with trapped ions, Nat. Phys. 4 , 757-761 (2008). 2012 © Optics & Photonics Focus AN is a Research Associate with the Department of Applied Physics at Stanford University, California, USA. His research focuses on quantum circuits, modeling of nano-photonic devices, and numerical optimization. Joseph W. Britton, Brian C. Sawyer, Adam C. Keith, C.-C. Joseph Wang, James K. Freericks, Hermann Uys, Michael J. Biercuk & John J. Bollinger, Engineered two-dimensional Ising interactions in a trapped-ion quantum simulator with hundreds of spins, Nature (2012) 484, 489-492 (link).
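A brute-force calculation makes the scaling argument concrete. The sketch below builds the Hamiltonian of a small transverse-field Ising chain and finds its ground state exactly; the matrix dimension is 2^N, which is why a few hundred interacting spins are far out of reach for classical machines. The coupling values and the one-dimensional geometry are toy choices for illustration, not the parameters of the NIST ion-crystal experiment, which realized Ising couplings on a two-dimensional crystal of hundreds of ions.

import numpy as np

# Exact state-vector treatment of a small transverse-field Ising chain.

X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)
I = np.eye(2)

def site_op(op, i, N):
    """Embed a single-site operator op at site i of an N-site chain."""
    mats = [op if j == i else I for j in range(N)]
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def ising_hamiltonian(N, J=1.0, h=0.5):
    """H = -J * sum_i Z_i Z_{i+1} - h * sum_i X_i (open chain)."""
    H = np.zeros((2**N, 2**N))
    for i in range(N - 1):
        H -= J * site_op(Z, i, N) @ site_op(Z, i + 1, N)
    for i in range(N):
        H -= h * site_op(X, i, N)
    return H

for N in range(2, 11, 2):
    H = ising_hamiltonian(N)
    ground_energy = np.linalg.eigh(H)[0][0]
    print(N, H.shape[0], round(ground_energy, 3))

# The matrix dimension doubles with every added spin; at roughly 40-50
# spins even storing the state vector exceeds any classical supercomputer,
# while the trapped-ion simulator handles hundreds of interacting spins.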
<urn:uuid:01c40dab-ce05-46fe-8d8b-cd8652ee5463>
CC-MAIN-2019-26
http://opfocus.org/index.php?topic=story&v=18&s=2
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998879.63/warc/CC-MAIN-20190619003600-20190619025600-00245.warc.gz
en
0.924119
1,939
3.640625
4
FREEDOM AND SAFETY Quantum computers are making all the headlines these days, but quantum communication technology may actually be closer to practical implementation. In a bid to hasten its arrival, researchers have now mapped out the path to a quantum internet. The building blocks for these emerging technologies are more or less the same. They both use qubits to encode information-the quantum equivalent to computer bits that can simultaneously be both 1 and 0 thanks to the phenomena of superposition. And they both rely on entanglement to inextricably link the quantum states of these qubits so that acting on one affects the other. But while building quantum computers capable of outperforming conventional ones on useful problems will require very large networks of qubits, you only need a handful to build useful communication networks. And we’re already well on the way. In a review article in Science, researchers from the University of Delft in the Netherlands outlined six phases of development towards a global network of quantum-connected quantum computers and point out that we’re already on the bottom rung of that ladder. “We are now at an exciting moment in time, akin to the eve of the classical internet,” the researchers wrote. “Recent technological progress now suggests that we may see the first small-scale implementations of quantum networks within the next five years.” The main advantages of a quantum communication network over a conventional one are speed and security. Entanglement makes it possible to communicate instantly across arbitrarily large distances in principle. No matter how far apart you put two entangled qubits, acting on one will have an instant and measurable impact on the other. It’s also essentially impossible to eavesdrop on a quantum conversation. Under quantum mechanics, if you read the quantum state of an object it changes that quantum state, which means the act of intercepting any message encoded in quantum states will immediately change the content of the message. But the same property that makes quantum communication intrinsically secure also poses a major challenge. It means qubits can’t be copied or amplified, two essential ingredients of classical communication systems. Nonetheless, working quantum “trusted repeater networks” are already in operation, which the researchers identify as the first step on the way to a full quantum internet. These networks feature nodes that can encode and decode qubits, which are then sent across optical cables or potentially beamed down from space by a satellite. But because quantum signals degrade the further they travel, it’s necessary to pass messages from node to node to cover longer distances. Each of these handovers is secure, but if two distant nodes need to communicate, then all the nodes in between know the content of the message, and so must be trusted if the message is to remain secure. To reach the next stage we will need to develop reliable quantum repeaters, the researchers said. This is a device that is able to establish entangled qubits with each node and then rely on quantum teleportation to effectively swap entanglements around so that the two nodes are entangled. A network connected by these kinds of repeaters would allow any node to securely communicate with any other without having to trust any of the intermediaries. 
At both these stages, the principle use would be quantum key distribution, which allows two nodes to securely share an encryption key in a way that can’t be eavesdropped on, which can then be used to decode encrypted messages sent via conventional communication channels. The process of entangling distant qubits is hit and miss at the minute, though, so the next stage will be to create a network that’s able to create entanglements on demand. The main advantage of this kind of “entanglement distribution network” is that it will make the network device-independent,according to the researchers. After that, the development of quantum memory will allow much more complicated communication protocols that require quantum information to be stored while further communication goes on. This is a major challenge, though, because quantum states rapidly degrade through a process called decoherence. Most technology proposals only hold their states for seconds or fractions of a second, which poses problems for a network whose communication times are longer than that. But if it could be realized, it would make it possible for simple quantum nodes to send computations to a quantum computer on the network, potentially creating a kind of quantum cloud. It could also make it possible to do things like synchronize distant telescopes to create a single “super telescope.” Ultimately, the goal is to create a network of fully-connected quantum computers. The first phase of that will be a “few-qubit fault-tolerant network,” in which the quantum computers at each node will not yet be large enough to out-do standard computers. Nonetheless, the fact that they incorporate fault tolerance will mean they will carry out relatively complex computation and store quantum data for significant amounts of time. And the final stage will come when these quantum computers finally surpass their conventional cousins, making it possible to create distributed networks of computers capable of carrying out calculations that were previously impossible, and instantly and securely share them around the world. The authors noted that there’s a long road ahead. We need better ways of encoding, storing, and transmitting quantum information, and perhaps even more importantly, we need to build quantum equivalents of our internet communication protocols, something almost entirely lacking today. But they’re bullish that the first multimode quantum networks will be appearing in the next few years, which will make it possible to test all these ideas and hopefully turbocharge development of a true quantum internet.
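To make the quantum key distribution primitive mentioned above more concrete, here is a toy simulation of the sifting step of BB84, one standard QKD protocol. It deliberately ignores channel noise, eavesdropper detection, error correction, and privacy amplification, and only shows how matching random basis choices yield a shared secret key.

import numpy as np

# Toy BB84 sifting: Alice encodes random bits in random bases, Bob
# measures in his own random bases, and they keep only the rounds where
# the bases happened to match.

rng = np.random.default_rng(7)
n = 32

alice_bits = rng.integers(0, 2, n)    # the raw key material Alice encodes
alice_bases = rng.integers(0, 2, n)   # 0 = rectilinear basis, 1 = diagonal
bob_bases = rng.integers(0, 2, n)     # Bob picks his bases independently

# When Bob measures in the same basis Alice used, he recovers her bit;
# when the bases differ, quantum mechanics gives him a 50/50 random outcome.
bob_bits = np.where(bob_bases == alice_bases,
                    alice_bits,
                    rng.integers(0, 2, n))

# Publicly comparing bases (not bits!) lets them keep only matching rounds.
keep = alice_bases == bob_bases
key_alice = alice_bits[keep]
key_bob = bob_bits[keep]

print(keep.sum(), "sifted bits")
print(np.array_equal(key_alice, key_bob))   # True in this noiseless toy model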
<urn:uuid:9e444d7b-6401-478e-8a4a-9935bae0e7f5>
CC-MAIN-2019-26
http://freedomandsafety.com/en/content/blog/quantum-computing-quantum-internet-roadmap
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560628001138.93/warc/CC-MAIN-20190627115818-20190627141818-00086.warc.gz
en
0.918917
1,159
3.5
4
For many of us, the fast-evolving pace of technology means we are increasingly surrounded by one black box after another. We may have a rudimentary idea of how things work, but not enough to do anything more than understand the instructions. To really understand, you have to be able to open up the black box, actually see how it works and then put it back together. The latter point is one reason I leave my car engine well alone. The same goes for my laptop – I’m happy to leave that to the techies and the coders. But even if I was computer savvy, how am I supposed to get my head around the quantum computer revolution heading our way, when it’s impossible to look inside while it’s running? One of the many quirks of quantum computing is that it relies on the strange interaction of atoms and subatomic particles, and then there’s the small matter that the whole fragile quantum state collapses once you try and look at what’s actually going on. Seeing quantum computing in action “A quantum computer is the ultimate black box,” smiles quantum physicist Professor Lloyd Hollenberg who heads the University of Melbourne’s first ever course on quantum computer programming. He’s smiling because even after just 15 minutes, he is pleased a dullard like me is starting to show some rudimentary understanding of how quantum computing works. And it’s all thanks to a quantum computer simulator he and his colleagues have developed that basically lets you operate a quantum computer... with the lid off. “To see how a quantum computer works you want to crack it open, but in the process, you collapse the quantum state, so what do you do? Our simulator was designed to solve that problem, and in terms of its ease of use and what it tells you, it’s unique,” says Professor Hollenberg. There are already opportunities for people to access online a few of the small-scale quantum computers that have so far been developed, but generally programmers will only get back the final output from their coding – they won’t be able to ‘see’ how it works. It’s this ability to see inside that Professor Hollenberg says is crucial to help students learn by actually doing, and for professionals to debug their quantum code. qubits and pieces The simulator – the Quantum User Interface or QUI – is software that lets you click and drag logic instructions that operate on quantum bits (known as qubits) in order to write a program. A remote cluster of computers at the University runs the program on a simulated quantum computer and sends back the results in real time so the user can inspect and visualise all aspects of the quantum computer’s state at every stage in the program. A qubit is simply the quantum version of a classical computer ‘bit’ – the basic unit of computer data that exists in one of two states, which in programming we know as 0 or 1. This is the basis of all computer coding, and in a classical computer the 0s or 1s are usually represented by the different voltages that run through its transistor. But in a quantum computer the bits, or qubits, are quantum objects, like an electron in an atom, which can be in one of two states that we can likewise label for convenience as 0 or 1. What these quantum objects actually are varies across different quantum computer systems, but for the computer programmer that isn’t so important. What is important is the 0 and 1 generated by the quantum objects enables us to use the objects for coding. 
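The linear algebra a simulator has to track is small for one qubit. The sketch below is not the QUI's actual code, just the textbook bookkeeping any such simulator is built on: a two-component state vector, a gate applied as a matrix, and measurement probabilities given by the squared amplitudes.

import numpy as np

# Minimal single-qubit simulation: state vector, one gate, one measurement.

ket0 = np.array([1, 0], dtype=complex)                        # the |0> state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                     # an equal superposition of 0 and 1
probs = np.abs(state) ** 2
print(probs)                         # [0.5 0.5]

# "Measuring" collapses the superposition: sample an outcome and the
# state becomes the corresponding basis vector.
rng = np.random.default_rng(0)
outcome = rng.choice([0, 1], p=probs)
state = np.eye(2, dtype=complex)[outcome]
print(outcome, state)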
Getting past the weird physics What’s different about qubits is that because of the weird physics that exists at the atomic scale, each qubit can be in an unresolved ‘quantum superposition’ state of 1 and 0. When observed (which, remember, collapses the quantum state) each will have some probability of being 0 or 1, depending on how the quantum superposition was formed. Qubits can also be made to influence each other through a property called entanglement so that if a qubit is resolved as 0, another might automatically be 1. It is these peculiarities of quantum physics that promise to make quantum computers much more powerful than classical computers and, therefore, able to address difficult problems – like optimising complex routes or systems from weather forecasting to finance, designing new materials, or aiding machine learning. Unlike a classical computer that laboriously computes all possibilities before finding the right answer, a quantum computer uses the wave-like properties of data structures and numbers to narrow down the probability of the correct answer for problems that, theoretically, our current computers have no hope of matching. For the students accustomed to classic computer programming – it’s like learning from scratch. “You have to think really differently with quantum programming,” says electrical engineering student Fenella McAndrew, who is doing the course. “We’re basically going back to the start of the programming process, like working at the level of one circuit in conventional programming.” And the rules of how numbers are processed in a quantum computer is, for the uninitiated, mind boggling. “Teaching the material is a challenge, especially without a quantum mechanics background,” says Professor Hollenberg. “But there is a clear demand from students and professionals to learn more about the technology and get themselves ‘quantum ready’.” It was this need to make quantum computing more accessible to people with no physics background that was the genesis of developing QUI. The system allows programmers to see each phase of the operation and exactly what the quantum computer is doing – in particular how the quantum data is being manipulated to produce the output of the program. This is critical information for a would-be quantum programmer to understand. For students grappling with quantum theory that even experts struggle to explain, it’s reassuring to get started using the QUI and actually see quantum computing at work. “Quantum software design is such a new concept, and it feels more abstract than conventional computer coding, so being able to see what is happening when we design a function really helps,” says Daniel Johnston, a maths student taking the course. SOLVING PROBLEMS IMMEDIATELY Professor Hollenberg’s co-teacher, physicist Dr Charles Hill, says QUI means students are actually doing quantum computing themselves from the outset of the course. “People learning quantum computing need to understand how bits of information are manipulated with the unique rules that are different from classic computing, and then write their programs in a way that solves the problem they are looking at. “We’ve found that the QUI is easy and intuitive to use. It takes the beginner no more than five minutes to get started with the system and writing programs,” Dr Hill says. According to Professor Hollenberg, further versions of QUI are now in the works. 
“We are working on improvements and add-ons which will not only enhance the user experience, but also the uptake of the system more broadly in teaching and research,” he says. Budding quantum software programmers, or indeed anyone interested in quantum computing, can view the introductory video and try the system at QUIspace.org. In addition to Professor Hollenberg and Dr Hill, the QUI development team comprised Aidan Dang, Alex Zable and Matt Davis; IT experts Dr Melissa Makin and Dr Uli Felzmann; and research students Sam Tonetto, Gary Mooney and Greg White. Banner Image: Peaks of probabilities associated with data in a quantum algorithm evolve through time in a complicated way. The example pictured is part of a simulation of a quantum computer finding the prime factors of a number using Shor’s Algorithm. Picture: Matthew Davis, Gregory White and Aidan Dang
<urn:uuid:bdfe92b5-45a1-40b3-ba33-a10bf629af67>
CC-MAIN-2019-26
https://pursuit.unimelb.edu.au/articles/lifting-the-lid-on-quantum-computing
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627999263.6/warc/CC-MAIN-20190620165805-20190620191805-00047.warc.gz
en
0.935966
1,650
3.625
4
Learn Quantum Computing with Python and Q# demystifies quantum computing. Using Python and the new quantum programming language Q#, you’ll build your own quantum simulator and apply quantum programming techniques to real-world examples including cryptography and chemical analysis. A great introduction to the exciting new world of quantum computing.

Part 1: Getting Started with Quantum
1 Introducing Quantum Computing
1.1 Who This Book is For
1.2 Who This Book is Not For
1.3 How this book is organized
1.4 Why does quantum computing matter?
1.5 What Can Quantum Computers Do?
1.6 What is a Quantum Computer?
1.6.1 How will we use quantum computers?
1.6.2 What can’t quantum computers do?
1.7 What is a Program?
1.7.1 What is a Quantum Program?
2 Qubits: The Building Blocks
2.1 Why do we need random numbers?
2.2 What are Classical Bits?
2.2.1 What Can We Do With Classical Bits?
2.2.2 Abstractions are our friend
2.3 Approaching Vectors
2.4 Seeing the Matrix for Ourselves
2.4.1 Party with inner products
2.5 Qubits: States and Operations
2.5.1 State of the qubit
2.5.2 The game of Operations
2.5.3 Measuring Qubits
2.5.4 Generalizing measurement: basis independence
2.5.5 Simulating qubits in code
2.6 Programming a Working QRNG
3 Sharing Secrets With Quantum Key Distribution
4 Nonlocal Games: Working With Multiple Qubits
5 Teleportation and Entanglement: Moving Quantum Data Around
Part 2: Programming Quantum Algorithms In Q#
6 Changing the odds: An introduction to Q#
6.1 Introducing the Quantum Development Kit
6.2 Functions and Operations in Q#
6.3 Passing Operations as Arguments
6.4 Playing Morgana’s Game in Q#
7 What is a Quantum Algorithm?
8 Quantum Sensing: Measuring At Very Small Scales
Part 3: Applied Quantum Computing
9 Computing Chemistry Problems With Quantum Computers
10 Searching Databases With Quantum Computers
11 Arithmetic With Quantum Computers
Appendix A: Installing Required Software
A.1 Installing a Python Environment
A.1.1 Installing Anaconda
A.1.2 Installing Python packages with Anaconda: QuTiP
A.2 Installing the Quantum Development Kit
A.2.1 Installing the .NET Core SDK
A.2.2 Installing the Project Templates
A.2.3 Installing the Visual Studio Code extension
A.2.4 Installing IQ# for Jupyter Notebook
A.2.5 Installing the qsharp Python package

About the Technology
Quantum computing is the next step in computing power and scalability, with the potential to impact everything from data science to information security. Using qubits, the fundamental unit of quantum information, quantum computers can solve problems beyond the scale of classical computing. Software packages like Microsoft's Quantum Development Kit and the Q# language are now emerging to give programmers a quick path to exploring quantum development for the first time.

About the book
Learn Quantum Computing with Python and Q# demystifies quantum computing. Using Microsoft’s Quantum Development Kit to abstract away the mathematical complexities, this book builds your understanding of quantum computers by actively developing for them. You’ll start by learning QC fundamentals by creating your own quantum simulator in Python. Soon you’ll move on to using the QDK and the new Q# language for writing and running algorithms very different to those found in classical computing. When you’re finished you’ll be able to apply quantum programming techniques to applications like quantum key distribution, and tackle real-world examples such as chemistry simulations and searching unsorted databases.
- The underlying mechanics of how quantum computers work
- How to simulate qubits in Python
- Q# and the Microsoft Quantum Development Kit
- How to apply quantum algorithms to real-world examples

About the reader
No academic experience of quantum computing is required. Readers need basic programming skills and some experience with linear algebra, calculus, and complex numbers.

About the authors
Christopher Granade completed his PhD in physics (quantum information) at the University of Waterloo's Institute for Quantum Computing and now works in the Quantum Architectures and Computation (QuArC) group at Microsoft. He works on developing the standard libraries for Q# and is an expert in the statistical characterization of quantum devices from classical data. Previously, Christopher helped Scott Aaronson turn his lectures into his recent book, Quantum Computing Since Democritus. Sarah Kaiser completed her PhD in physics (quantum information) at the University of Waterloo's Institute for Quantum Computing. She has spent much of her career developing new quantum hardware in the lab, from satellite systems to hacking quantum cryptography hardware. Communicating what is so exciting about quantum is her passion, and she loves finding new demos and tools to help the quantum community grow. When not at the keyboard, she loves kayaking and writing books about engineering for kids.
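The book's early chapters have you simulate qubits in Python and build a quantum random number generator (QRNG). The sketch below is a minimal, NumPy-only illustration of that idea; it is not the book's own code, and the function names are illustrative.

```python
import numpy as np

# A single qubit is a 2-element complex vector; |0> = [1, 0].
KET_0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def measure(state):
    """Sample 0 or 1 with probabilities given by the squared amplitudes."""
    probabilities = np.abs(state) ** 2
    return int(np.random.choice([0, 1], p=probabilities))

def qrng_bit():
    """One run of the toy QRNG: prepare |0>, apply H, measure."""
    return measure(H @ KET_0)

print("Random bits:", [qrng_bit() for _ in range(16)])
```

Each call prepares a fresh superposition and collapses it on measurement, so every bit comes out 0 or 1 with equal probability (within this simulation, of course, NumPy's classical generator is doing the actual sampling).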
<urn:uuid:e9a31840-f3eb-4416-9dd7-abd4e9e0506c>
CC-MAIN-2019-26
https://www.manning.com/books/learn-quantum-computing-with-python-and-q-sharp?utm_source=libhunt&utm_medium=web&utm_campaign=libhunt_learnquantumcomputingwithpythonandqsharp&utm_content=promo
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627997731.69/warc/CC-MAIN-20190616042701-20190616064701-00287.warc.gz
en
0.82789
1,143
3.53125
4
by Alexandru Gheorghiu (University of Edinburgh) and Elham Kashefi (University of Edinburgh, CNRS)

Quantum computers promise to efficiently solve not only problems believed to be intractable for classical computers, but also problems for which even verifying the solution is intractable. How, then, can one check whether quantum computers are indeed producing correct results? We propose a protocol to answer this question.

Quantum information theory has radically altered our perspective on quantum mechanics. Initially, research into quantum mechanics was devoted to explaining phenomena as they are observed in nature. The focus then shifted to designing and creating quantum systems for computation, information processing, communication, and cryptography, among many other tasks. In particular, it became clear that quantum interference - "the heart of quantum mechanics", as Richard Feynman described it - can be harnessed for quantum computation. Algorithms running on a hypothetical quantum computer would be able to solve problems by creating an interference pattern of different computational branches. This can lead to an exponential saving in the resources used by a quantum algorithm compared to the best known classical algorithms. The most famous example is Shor's algorithm for factoring numbers, which is exponentially faster than the best known classical factoring algorithms. But having a device which can solve problems exponentially faster than classical computers raises an interesting question: can a classical computer efficiently verify the results produced by this device? At first, one might be tempted to dismiss this question and say that as long as each component of a quantum computer has been tested and works correctly, there is no need to worry about the validity of the device's results. However, the point of verification is much more profound. Quantum computers would provide one of the most stringent tests of the laws of quantum mechanics. While numerous experiments involving quantum systems have already been performed to remarkable precision, they all utilized relatively few degrees of freedom. When many degrees of freedom are involved, predicting the outcome of an experiment requires exponential resources, and it quickly becomes infeasible to calculate the possible results without resorting to crude approximations. Verification of quantum computation would therefore allow for a new test of quantum mechanics, a test in the regime of high complexity. There is another important reason for verifying quantum computations, having to do with cryptography. The first quantum computers are likely to be servers to which clients connect through the Internet. We can already see an instance of this with the recent 5-qubit and 16-qubit devices that IBM has made available to the general public [L1]. When larger devices become available, users will wish to delegate complex computations to them. However, in such a distributed environment, malicious agents might perform man-in-the-middle attacks or compromise the remote server. The clients would then need a means to check the validity of the server's responses. In fact, in this setting, users might also wish to keep their data hidden even from the quantum computer itself, as it might involve sensitive or classified information. So can one verify quantum computations while also maintaining the secrecy of the client's input? The answer is yes.
In fact, the client's ability to keep the input hidden is what makes verification possible. This was shown by Fitzsimons and Kashefi when they proposed a verification protocol based on a cryptographic primitive known as Universal Blind Quantum Computation (UBQC) [1,2]. In UBQC, a client that can prepare single qubits is able to delegate quantum computations to a server in such a way that the server is oblivious to the computation being performed. To do verification, the client can then exploit this property by embedding tests, referred to as traps, in the computation; these traps fail if the server does not perform the correct computation. Of course, the problem with this approach is that the client needs to trust that the qubit preparation device works correctly and produces the specified states. If, prior to the start of the protocol, a malicious agent corrupts the preparation device, the client could later be tricked into accepting incorrect results. To address this issue, we, together with Dr. Petros Wallden at the University of Edinburgh, proposed a verification protocol which is device-independent. In other words, the client need not trust any of the quantum devices in the protocol. This is achieved by using a powerful result of Reichardt, Unger and Vazirani known as rigidity of non-local correlations. Non-local correlations are correlations between the responses of non-communicating parties that cannot be reproduced classically unless the parties are allowed to communicate. Such correlations can be produced, quantum mechanically, through a suitable strategy for measuring certain entangled states. The rigidity result is essentially a converse to this: it states that certain non-local correlations can only be produced by a particular, unique strategy. Observing such correlations between non-communicating devices then implies that the devices are behaving according to this fixed strategy. What is remarkable about this result is that it only requires examining the outputs of the devices, without assuming anything about their inner workings. The protocol then works as follows: the client has an untrusted device for measuring single qubits and also communicates classically with the quantum server. By examining the outputs of the two devices, it follows from the rigidity result that the client can check whether the two devices are sharing entanglement and performing measurements as instructed. If so, the client leverages this and uses the entanglement to remotely prepare single-qubit states on the server's side. Finally, the client uses the trap-based scheme of Fitzsimons and Kashefi to delegate and verify an arbitrary quantum computation on the server.

Figure 1: Device-independent verification protocol. The client, or verifier, instructs both the measurement device and the server to measure entangled qubits. The statistics of these measurements are then checked by the verifier. All communication with the quantum devices is classical.

Verification is an important milestone on the road to scalable quantum computing technology. As we have seen, verification protocols exist even for the most paranoid users. Even so, questions remain regarding their optimality and their ability to tolerate noise and imperfections, as well as other issues. Addressing these questions is a key challenge for both theorists and experimentalists, and their resolution will shape the landscape of the emerging Quantum Internet.
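The trap idea is easy to see in a purely classical toy model: the client hides a few test computations whose answers it already knows among the real ones, and rejects the server's output if any trap comes back wrong. The sketch below only illustrates that logic; it is not UBQC, the "jobs" are trivial arithmetic, and all names are made up. In the real protocol the traps are hidden from the server by the blindness of the delegated quantum computation rather than by shuffling.

```python
import random

def delegate_with_traps(real_jobs, server, n_traps=8):
    """Toy trap-based verification: mix known-answer traps into the job list.

    real_jobs: list of (a, b) pairs whose sums we want the server to compute.
    server:    a function pair -> answer (possibly dishonest).
    """
    traps = [(random.randint(0, 99), random.randint(0, 99)) for _ in range(n_traps)]
    jobs = [("real", x) for x in real_jobs] + [("trap", t) for t in traps]
    random.shuffle(jobs)  # the server cannot tell traps from real work

    results = {}
    for kind, job in jobs:
        answer = server(job)
        if kind == "trap" and answer != sum(job):
            raise RuntimeError("Trap failed: rejecting the server's output")
        if kind == "real":
            results[job] = answer
    return [results[x] for x in real_jobs]

honest_server = lambda pair: pair[0] + pair[1]
print(delegate_with_traps([(1, 2), (10, 20)], honest_server))  # [3, 30]
```

A server that cheats on even a modest fraction of the jobs is caught with probability that grows quickly in the number of traps, which is the same statistical intuition behind the trap qubits in the quantum protocol.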
References:
[1] A. Broadbent, J.F. Fitzsimons, E. Kashefi: "Universal blind quantum computation", in Proc. of FOCS '09, IEEE Computer Society (2009) 517-526.
[2] J.F. Fitzsimons, E. Kashefi: "Unconditionally verifiable blind quantum computation", Phys. Rev. A 96 (2017) 012303.
[3] A. Gheorghiu, E. Kashefi, P. Wallden: "Robustness and device independence of verifiable blind quantum computing", New Journal of Physics 17(8) (2015) 083040.
[4] B.W. Reichardt, F. Unger, U. Vazirani: "Classical command of quantum systems", Nature 496(7446) (2013) 456.
Elham Kashefi, University of Edinburgh, UK and CNRS, France
<urn:uuid:0be65c15-cf38-4f53-8cb5-12cab097a0e2>
CC-MAIN-2019-26
https://ercim-news.ercim.eu/en112/special/keeping-quantum-computers-honest-or-verification-of-quantum-computing
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998475.92/warc/CC-MAIN-20190617123027-20190617145027-00332.warc.gz
en
0.930731
1,526
3.5
4
The Industrial Revolution started in Britain in 1784. It marked the end of the medieval era and the start of the modern era, and a shift from human labour to machine work. It affected almost every aspect of human life. Starting in Britain, it spread across the whole world. Until that time wood had been the primary source of energy, but coal took over, and both the generation of energy and the demand for it increased. The first steam engine was developed to pump water up from below ground, enabling coal to be mined deep underground. The steam engine led to the invention of steam trains, steam-powered pumps, and other machines. The first modern water-powered cotton spinning mill was established by Richard Arkwright in 1774 in Cromford, Derbyshire, with around 200 workers (today it is a UNESCO World Heritage site). People migrated to live near the newly formed industries, crowding these areas with a supply of labourers, and this is how modern urban areas developed. Huge sums of money were invested in building canals, a period also known as Canal Mania, and the production of pig iron grew alongside the production of coal. Residential areas, canals, railways, roads and other infrastructure developed. Britain's first railway line, the Stockton and Darlington Railway, opened in 1825. In the 1840s railway building reached its peak, a period also known as Railway Mania. That, in brief, was Industrial Revolution 1. After this advance in mechanisation came the Technological Revolution, also known as Industrial Revolution 2, which started in the latter part of the 19th century. The Industrial Revolution had brought mass production of cloth and goods and a surplus of agricultural products, and it changed global trade. The Technological Revolution brought new innovations in steel production, electricity and petroleum, which made way for the introduction of the automobile and then the aeroplane. During this period manufacturing and production methods were improved; for example, steel, which was strong and cheap, replaced iron, making it possible to build rail lines at low cost, spreading transportation and facilitating the construction of ships, skyscrapers and larger bridges. Today you cannot imagine a world without electricity, but back then its absence was the norm, and it was in this era that the brilliant idea of electricity was put to work. Edison made the first practical electric light bulb, and the first commercial bulbs appeared in the 1870s. Then Mosley Street and the Savoy Theatre set the stage for the first large-scale power stations. The Second Industrial Revolution revolved around electricity, and many of the components needed for it were invented in this period; for example, Sebastian de Ferranti's work on stepping down high-voltage alternating current, which also helped enable the assembly line and mass production. With technology and electricity, the Second Industrial Revolution opened the way for long-distance communication: in 1875 Alexander Graham Bell invented the telephone, and years later, in 1901, Guglielmo Marconi sent radio waves across the Atlantic Ocean for the first time in history. Many products on which life now depends were invented in this era, such as the modern paper machine and the steam-driven rotary printing press. This revolution lasted around 100 years, and then it was time for Industrial Revolution 3, which started around the 1960s. We are surrounded by technology, electronics and the Internet,
and all of it has become part of our lives. But when were all these technologies invented? During Industrial Revolution 3. We can also call it the Digital Revolution, as it brought semiconductors, mainframe computing, personal computers, the Internet and more. What was analogue became digital: for example, the old television tuned with an antenna has been replaced by an Internet-connected tablet. With digitalisation, automation took hold in industries, further reducing manual labour. Electronics and information technology began to automate production and take supply chains global. Work that was paper-based became computerised. We can also say Industrial Revolution 3 gave birth to a new field, the IT sector. It was also the point where people's lifestyles changed with the Internet, phones, electronics, automation and more. Developing countries today are still going through Industrial Revolution 3. In the 1980s only 1% of the world's information was in digital format; now, in 2019, around 99% of it is digital. In the 21st century we are all going through Industrial Revolution 4, which can also be called the Artificial Intelligence revolution. We are breaking down the walls between the digital, physical and biological spheres, creating what are known as cyber-physical systems. It is bringing breakthroughs in numerous fields, including robotics, artificial intelligence, quantum computing, nanotechnology, biotechnology, the Internet of Things, 3D printing, autonomous vehicles, fifth-generation wireless technology and advanced genetic knowledge. We are all in the fourth wave of the Industrial Revolution, and we are going to witness great human success in science and technology. The change the world has seen in these 200 years is unimaginable. No one ever believed that in just 200 years we would fly, travel 100 times faster than a horse, connect with anyone, anywhere, at any time in a fraction of a second, touch the Moon, and uncover secrets of the universe, nanotechnology, biotechnology and more. But this haphazard development has also led to many problems, such as pollution, degradation of the Earth, and new challenges in health, cyber security and beyond. Perhaps the next advance in this revolution will be to make the Earth pollution-free and to solve these problems.
<urn:uuid:ae34855a-86f6-4db3-8c99-6c1970a8a824>
CC-MAIN-2019-26
https://aissmsioit.org/industrial-revolution/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998475.92/warc/CC-MAIN-20190617123027-20190617145027-00332.warc.gz
en
0.971864
1,092
3.5625
4
The first ever programmable computer was an electro-mechanical hybrid of a machine put together in the late 1930s by German wunderkind Konrad Zuse in his parents' front room. By the first half of the 1940s a fully digital machine, the Atanasoff-Berry Computer, had been built by its two namesakes, scientists working at Iowa State University. It took another several decades before computers became both efficient and cheap enough to become the norm, first in workplaces and then in homes. Over the 25 years since the economy and much of the entertainment sector were digitalised, there has been a rapid acceleration in the pace of development. The latest technology in the world of computer hardware and software is able to achieve some quite remarkable feats. Even the smartphones we carry in our pockets contain more powerful processors than the laptops and PCs of ten years ago. We've undoubtedly come a long, long way since the computers of Zuse, Berry and Atanasoff. We're on the verge of AI software algorithms taking control of driverless cars and creating virtual reality worlds in which, some forecast, people will earn their primary income within a decade or so. But despite the wonders modern software developers and hardware engineers are achieving at a seemingly ever accelerating rate, today's most powerful computers and programmes retain one key similarity to Zuse's first computer. Computers are all still programmed using exactly the same basic building blocks of binary code. Software code is, and always has been, represented by strings of '1' and '0'. If you know where to look, the code that CERN scientists in Switzerland use to crunch the data produced by the Large Hadron Collider, the photos on your smartphone, the PowerPoint presentation you made at work last week and your favourite computer game are all just endless streams of '0' and '1'. These digits are represented by tiny circuits that switch between off and on, standing for '0' or '1' depending on their position. These circuits are called 'bits'. This binary system of bits has made the latest technology in the world possible, and for most tasks we use computing for it is more than enough. However, for the most complex tasks we are now trying to compute, this binary system is beginning to prove a bottleneck. Some problems, such as modelling traffic flows, weather systems or the human genome, become exponentially harder with each variable added. Faster and faster classical processors allow algorithms to sift very quickly through all the possible variables in their quest to arrive at an answer or end result. However, the sheer scale of the potential variables in some cases, such as the examples provided, means that it takes too long for binary computers to arrive at an answer for them to be a practical solution. It is hoped that quantum processors will solve that bottleneck.

So What is Quantum Computing?
Quantum mechanics is the fundamental theory of physics that explains nature by breaking it down to the smallest possible level – the energy inherent in atoms and sub-atomic particles. Quantum computing is made possible by scientists being able to use electrons as quantum bits, or qubits. Qubits are distinct from classical bits not only by their microscopic size but by their very nature. Electrons exist simultaneously in two states at once. Or rather, they exist between two states to varying degrees.
It's something that even the greatest minds in theoretical physics grapple with, but in the simplest possible terms: by maintaining qubits at an extremely cold temperature and using magnetism to manipulate their polar field, they can represent both a '1' and a '0' at the same time. Rather than working through the string of '1's and '0's of binary code, one qubit, which is both '1' and '0', can speak to two more qubits, which each speak to two more qubits, so that almost instantaneously countless qubits are working together to process information. This means that, in theory, qubits are able to run an indeterminate set of processes simultaneously and at a spectacular speed. This exponentially increases the processing power of a theoretical quantum computer system.

At What Stage is Quantum Computing Technology?
So that's the theory, but at what stage of development is quantum computing technology in practice? Computer processors that have been proven to employ quantum phenomena do already exist. The tech big boys such as IBM, Microsoft and Alphabet are all working intensely on quantum computing R&D, and one company, D-Wave Systems, is manufacturing commercially available quantum computing hardware. However, the technology is still at an early stage, is far from perfect and can't, as yet, do anything that classical binary processors cannot. Nonetheless, it has already been demonstrated that these early quantum processors can solve problems created specifically to take account of their current structure and limitations far more quickly than classical processors can. Microsoft's Research Lab is predicting that working quantum computers will be widely available within a decade. Many other experts approximately agree, give or take a decade or so. Others are sceptical and think that quantum computing faces obstacles that mean it will never be genuinely practical. The biggest problem that researchers working on quantum processors face is keeping enough electrons in a qubit state for long enough. It takes a huge amount of power to maintain the temperature and magnetic manipulation necessary for a qubit to come into existence for even a fraction of a second. When an electron loses its qubit state, the information held in it is also lost unless it is passed on in time. This means the same information has to be held on multiple qubits simultaneously and passed through the network quickly. It's a bit like a movie scene where the hero or heroine is running across a crumbling bridge and has to reach the other side of a ravine before it crumbles from beneath their feet. The challenge is in creating processors that can maintain enough qubits in a connected state for long enough. D-Wave's most recent working prototype, the 2000Q system, is said to have made a 'critical breakthrough' in this respect. One has been bought and installed in the Quantum AI Lab run in partnership by Alphabet, NASA and the Universities Space Research Association. IBM also believes its research lab will succeed in connecting 50 qubits in a processor before the end of the year. The company has also managed to increase the period of time qubits exist to 100 microseconds and expects this to be multiplied by ten to a millisecond within 5 years. Within the next few years, quantum computing processors are expected to reach a 'good enough' approximation of the technology.
'Good enough' will mean a system that is based on quantum phenomena and, at least in certain applications, achieves processing speeds that classical binary processors cannot.

The Future for Quantum Computing
The 'full' quantum revolution is thought to still be many years away, but plenty of encouraging breakthroughs are taking place. Key to increasing the stability of qubits may be recent achievements by a team of quantum scientists at Harvard University led by Professor Kang-Kuen Ni. The team have, for the first time, succeeded in creating a 'bespoke molecule'. Optical tweezers were used to manipulate single sodium and caesium atoms into an alloy that represents the first man-made molecule. The molecule also happens to have the optimal electrical constitution to be maintained in a qubit state. We don't yet really know how quickly developments in quantum computing technology may come about. It may be that, as in many fields of science, the modern era will see us make leaps in years that previously took multiple decades. As an almost completely new science, quantum computing may also take the decades to develop that classical binary computing took before progress started to speed up. We may also hit a bottleneck that means quantum computing turns out to be a dead end. If we do get there, though, and there are plenty of positive indicators that we may, quantum computing could be the most significant step yet towards humanity succeeding in stripping back the Veil of Maya to see the secrets of the universe revealed.
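To make the earlier point about qubits concrete: a straightforward classical simulation of an n-qubit register has to track 2^n complex amplitudes, which is why the state space outgrows classical memory so quickly. The short sketch below is plain Python and purely illustrative; it just prints how much memory a full state-vector simulation would need as qubits are added.

```python
# Memory needed to store a full n-qubit state vector:
# 2**n complex amplitudes, each taking 16 bytes (two 64-bit floats).
BYTES_PER_AMPLITUDE = 16

def state_vector_bytes(n_qubits):
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (10, 20, 30, 40, 50):
    gib = state_vector_bytes(n) / 2**30
    print(f"{n:2d} qubits -> {gib:,.2f} GiB of amplitudes")

# 30 qubits already needs about 16 GiB; 50 qubits needs about 16 PiB,
# far beyond any single classical machine.
```

This is also why the 50-qubit range mentioned above is often quoted as the rough threshold at which brute-force classical simulation stops being feasible.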
<urn:uuid:7b47a0a8-cc01-41b7-ad95-80792115293d>
CC-MAIN-2019-26
https://scommerce.com/what-is-quantum-computing-what-can-it-achieve-and-how-close-are-we/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627997731.69/warc/CC-MAIN-20190616042701-20190616064701-00293.warc.gz
en
0.951982
1,686
3.6875
4
The Natural History Museum of Utah recently announced that a newly found dinosaur fossil, Lythronax argestes, represents a new branch of the tyrannosaur family tree. It weighed 2 tons and was over 24 feet long. Lythronax evolved over 10 million years before other tyrannosaurs, changing our understanding of dinosaur evolution. Lythronax was only one of a bunch of new dinosaur fossils discovered at Grand Staircase-Escalante National Monument. Metabolism: in a cellular context, it means the chemical reactions that happen in cells and help keep them alive. These reactions are fairly complex, and for a long time we thought that they could only happen inside cells, which kind of leads to a chicken-or-the-egg paradox. Now, scientists have found that it is relatively simple to have metabolic reactions happen outside of cells. RNA is used to make proteins, and you need these proteins to do things with RNA. But these experiments show that you don't need RNA to get the metabolic reactions happening. They could have happened in the Earth's early oceans. By starting with what we think the Earth's early oceans would have contained, along with the starting chemicals for metabolic reactions, then heating it to 50° to 70° C for 5 hours, they were able to produce 29 different metabolic reactions. These included glycolysis and the pentose phosphate pathway, which are needed for production of ATP. This helps scientists understand abiogenesis, how life first started. It takes out the requirement for a cell to form, with all of the necessary chemistry along with it, out of whole cloth. The chemistry is capable of working before the first cell formed. The part we don't understand yet is where the starting chemicals came from. We don't know how they could have formed yet. But we're getting closer. Scientists studying the brain have managed to grow neurons in petri dishes for a while, but they don't connect the way real neurons do because the ones in a dish grow in a fundamentally 2D environment, and real brains are fundamentally 3D. Now, researchers at Tufts University in Boston have made a 3D scaffold that allows neurons to connect more realistically. It has grey matter / white matter compartmentalization, which means that the structure is more similar to real brains. It can also last longer, up to two months in the lab. This new tissue can let scientists study brain biology in more detail. They can see what happens to nearby cells when there is trauma. They can also see the effects of administering drugs more easily. Here's a good TED video on quantum mechanics, specifically quantum entanglement. However, there are some things that you should keep in mind. I know that someone will say, "Sure, I can tell if the cat is alive or not without opening the box. Just listen for the bomb." In the original thought experiment, there is a vial of poisonous gas, a Geiger counter, and a radioactive source. If the Geiger counter detects radiation, it will break the vial (killing the cat). After one hour, there is a 50% chance that this will happen. The bomb version is easier to understand, but you have to realize that you can't detect whether the bomb has exploded or not. Also, for the entanglement to work, you have to set things up very carefully to make the entanglement happen. You can't just grab 2 atoms and have them be entangled. NASA has recently tested a new type of drive that may be used in future spaceships. The Cannae Drive is unique in that it doesn't use propellant.
Since propellant (fuel) has mass, normal drives need to move the spacecraft and the propellant required for future thrust. This leads to needing lots of mass, frequently as much as the payload. But the Cannae Drive is different: it uses microwaves instead of propellant. By bouncing microwaves in a specially shaped container, they have managed to create a difference in radiation pressure, generating between 30 and 50 micronewtons of thrust. This is a very small amount of thrust. The only energy that is needed is electricity, which is readily available through solar panels. This technology is in its infancy and is a long way from being used in spacecraft. I love this kind of thing because it appears to violate the law of conservation of momentum. This means that we're at the edge where our understanding of the way the universe works may be wrong. Our scientific understanding may have to change to account for this effect. Toxoplasma gondii is a single-celled parasite that lives in a cat's intestine. While it prefers felines, it can live in humans and other animals. In fact, about 1/3 of humans are hosts to it. Normally this isn't a problem, but it is for people with suppressed immune systems. The interesting thing is that a human body's reaction to T. gondii is similar to its response to cancer tumors. This leads to the idea that perhaps this parasite can be used as a cancer therapy. T. gondii stimulates the body's immune system to fight cancer. While cancer can shut down the immune system, T. gondii stimulates it. Scientists have created a version of the parasite that can be grown in a lab, but can't grow in animals or people. This may lead to an effective cancer drug that helps the body fight the disease.
<urn:uuid:583237e8-d765-4d82-b658-e42490e0a081>
CC-MAIN-2019-26
https://rileysci.com/2014/08/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627999615.68/warc/CC-MAIN-20190624150939-20190624172939-00133.warc.gz
en
0.951003
1,154
3.75
4
Semiconductor Technology May Pave Way For Integrating Quantum Communications Into Standard Optical Channels

Since the 1920s, scientists have theorized ways to exploit the properties of quantum systems for communication purposes. By utilizing the strange properties of quantum entities and phenomena, like superpositions and entanglement, quantum communication channels could create genuinely unbreakable encryption protocols and provide computing power vastly superior to traditional computing methods. Now, a team of researchers has taken a significant step toward making the use of quantum communications a practical reality. In an article published October 1st in Nature, a group of scientists from the University of Groningen in the Netherlands reports that they have developed a reliable method to create entangled pairs of quantum particles at wavelengths close to those that are used by standard telecom providers. The new method relies on exploiting the structural defects of the semiconductor silicon carbide in optical fibers to produce qubits of information that can be transmitted via ordinary communication channels. Previous attempts at communicating using quantum systems have shown success, but most existing setups require custom hardware, as the produced qubits emit photons at wavelengths outside the range used by standard optical channels. The discovery of this method is a significant step towards integrating quantum communications into everyday communication systems.

Qubits And Semiconductors
Standard digital computing systems store information in registers called "bits." In digital computers, a bit can only take on one of two values, either a 1 or a 0 (hence the term binary). Quantum computing systems use "qubits," a generalization of the classical bit. A qubit can be 1, 0, or a superposition of both 1 and 0. Essentially, a classical computer bit can exist in only one state at one time while a qubit can exist in multiple states at the same time. The most interesting property of a qubit is that it can store arbitrarily large amounts of classical information, so a quantum computer theoretically could perform tasks that would be impossible or take an extremely long time on a regular digital computer. To produce a qubit, first, the system has to make an entangled pair of photons. Two entangled particles will have certain values correlated no matter how far apart they are separated, so a measurement on one will instantaneously give you information about the state of the other particle. So, in order for a quantum communication channel to work, there needs to be a reliable way to create entangled pairs of particles. Additionally, these superpositions must be sufficiently isolated from the surrounding environment, as any small disturbance can decohere the superposition into a classical state. It is known that various transition-metal impurities in semiconductors will emit photons when struck by light. These impurities are known as "color-centers" and can affect the behavior of light. When light is shined on these impurities, excited electrons will jump to a higher energy state. When they fall back to their ground state, the electrons will emit the excess energy as a photon.
For a material like silicon carbide with impurities of the metal molybdenum, the photons are emitted at an infrared wavelength close to that used in standard telecom communications channels. With this information in mind, the team began constructing their system. By using a procedure known as "coherent population trapping," they were able to create superpositions of electrons in the color-centers of samples of silicon carbide with molybdenum impurities. These electron superpositions represented the qubits of information, since entangled electrons always have correlated spin values. Using magnetic fields, the team was able to align the superpositions in whatever direction they desired. According to Ph.D. student Carmem Gilardoni, one of the researchers credited on the paper, "If you apply a magnetic field, the spins align either parallel or anti-parallel to the magnetic field. The interesting thing is that as a result, the ground state for electrons with spin up or spin down is slightly different." Shining light on these electrons will make them fall back into one of two ground states and emit an entangled pair of photons. After some initial attempts, they managed to produce stable and long-lasting superpositions at the color centers. The created superpositions showed optical lifetimes of 60 nanoseconds, long enough to extract useful information from the quantum system. Most importantly, the entangled photons were emitted at a wavelength of ~1100 nm. Traditional optical communications channels use infrared wavelengths of ~1,300-1,500 nm, and given the massive amount of knowledge about the effects of transition-metal impurities in semiconductors on the behavior of light, the team is confident that they can fine-tune their procedure to create photon pairs emitted at wavelengths that fall comfortably within those used in standard optical channels. One potential use of this technology would be to create genuinely unbreakable encryption on communication channels. Superpositions are finicky entities, and any disturbance can destroy a superposition by collapsing it into a definite state. If a person attempts to tap into a quantum communication channel to listen in on someone's conversations, their external interaction with the channel will cause the quantum state to collapse. The result is that it is impossible to eavesdrop on two people who are communicating using a quantum channel, as any attempt to tap into the channel from the outside will decohere the state and destroy the original information. Other applications of quantum computing include an internet with unparalleled speeds and running simulations of quantum systems. Information stored in qubits can be accessed at dazzling speeds, and a single qubit can hold far more information than a classical bit. Though still in its infancy, this technology could completely change the face of modern communications, as quantum communication channels could be integrated into existing hardware, ensuring that the benefits of quantum information systems would be readily available to as many people as possible.
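The claim that entangled particles give correlated measurement results can be checked numerically with a few lines of linear algebra. The sketch below (NumPy, purely illustrative and not related to the Groningen experiment's actual analysis) builds the two-qubit Bell state and samples joint measurements, showing that the two outcomes always agree even though each individual outcome is random.

```python
import numpy as np

# Bell state (|00> + |11>) / sqrt(2) as a 4-element state vector
# over the basis |00>, |01>, |10>, |11>.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def sample_joint_measurement(state, shots=10):
    """Sample joint outcomes of measuring both qubits in the computational basis."""
    probs = np.abs(state) ** 2
    outcomes = np.random.choice(4, size=shots, p=probs)
    # Decode the basis index into the two individual bits.
    return [(o >> 1, o & 1) for o in outcomes]

for a, b in sample_joint_measurement(bell):
    print(f"qubit A: {a}  qubit B: {b}")   # the two bits always match
```

Each qubit on its own gives 0 or 1 with equal probability, yet the pair never disagrees; that correlation, surviving transmission, is what quantum communication schemes like the one described above aim to distribute and exploit.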
<urn:uuid:c0fd874f-137e-45e2-8a7d-5abd9bd656b0>
CC-MAIN-2019-26
https://sciencetrends.com/semiconductor-technology-may-pave-way-for-integrating-quantum-communications-into-standard-optical-channels/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998716.67/warc/CC-MAIN-20190618103358-20190618125358-00180.warc.gz
en
0.920455
1,238
3.875
4
There's a lot of hype floating around the general computer industry, hype centered around one specific technology that has the potential to change everything: quantum computers. Being our company's namesake, we'll admit to a bias in our bullishness around this tech, and over the course of this final chapter of our Future of Computers series, we hope to share with you just why that is. At a basic level, a quantum computer offers an opportunity to manipulate information in a fundamentally different way. In fact, once this tech matures, these computers will not only solve mathematical problems faster than any computer currently in existence, but also faster than any computer forecast to exist over the next few decades (assuming Moore's law holds true). In effect, similar to our discussion around supercomputers in our last chapter, future quantum computers will enable humanity to tackle ever larger questions that can help us gain a profoundly deeper understanding of the world around us.

What are quantum computers?
Hype aside, just how are quantum computers different from standard computers? And how do they work? For visual learners, we recommend watching the fun, short video from the Kurzgesagt YouTube team on this topic. Meanwhile, for our readers, we'll do our best to explain quantum computers without the need for a physics degree. For starters, we need to recall that the basic unit of information computers process is a bit. These bits can have one of two values: 1 or 0, on or off, yes or no. If you combine enough of these bits together, you can then represent numbers of any size and do all manner of calculations on them, one after the other. The bigger or more powerful the computer chip, the bigger the numbers you can create and calculate with, and the faster you can move from one calculation to another. Quantum computers are different in two important ways. First is the advantage of "superposition." While traditional computers operate with bits, quantum computers operate with qubits. The superposition effect qubits enable is that instead of being constrained to one of two possible values (1 or 0), a qubit can exist as a mixture of both. This feature allows quantum computers to operate more efficiently (faster) than traditional computers. Second is the advantage of "entanglement." This phenomenon is a unique quantum physics behaviour that binds the fates of a number of different particles, so that what happens to one will affect the others. When applied to quantum computers, this means they can manipulate all their qubits simultaneously; in other words, instead of doing a set of calculations one after another, a quantum computer could do them all at the same time.

The race to build the first quantum computer
This heading is somewhat of a misnomer. Leading companies like Microsoft, IBM and Google have already created the first experimental quantum computers, but these early prototypes feature fewer than two dozen qubits per chip. And while these early efforts are a great first step, tech companies and government research departments will need to build a quantum computer featuring at least 49 to 50 qubits for the hype to meet its theorized real-world potential. To this end, there are a number of approaches being experimented with to achieve this 50-qubit milestone, but two stand above all comers. In one camp, Google and IBM aim to develop a quantum computer by representing qubits as currents flowing through superconducting wires that are cooled to nearly –273.15 degrees Celsius, or absolute zero.
The presence or absence of current stands for a 1 or 0. The benefit of this approach is that these superconducting wires or circuits can be built out of silicon, a material semiconductor companies have decades of experience working with. The second approach, led by Microsoft, involves trapped ions held in place in a vacuum chamber and manipulated by lasers. The oscillating charges function as qubits, which are then used to process the quantum computer's operations.

How we will use quantum computers
Okay, putting the theory aside, let's focus on the real-world applications these quantum computers will have and how companies and people will engage with them. Logistical and optimization problems. Among the most immediate and profitable uses for quantum computers will be optimization. For ride-sharing apps, like Uber, what's the fastest route to pick up and drop off as many customers as possible? For e-commerce giants, like Amazon, what's the most cost-effective way to deliver billions of packages during the holiday gift buying rush? These simple questions involve number crunching hundreds to thousands of variables at once, a feat that modern supercomputers just can't handle; so instead, they compute a small percentage of those variables to help these companies manage their logistical needs in a less than optimal way. But a quantum computer will slice through a mountain of variables without breaking a sweat. Weather and climate modeling. Similar to the point above, the reason the weather channel sometimes gets it wrong is that there are too many environmental variables for their supercomputers to process (that and sometimes poor weather data collection). But with a quantum computer, weather scientists can not only forecast near-term weather patterns perfectly, but they can also create more accurate long-term climate assessments to predict the effects of climate change. Personalized medicine. Decoding your DNA and your unique microbiome is crucial for future doctors to prescribe drugs that are perfectly tailored to your body. While traditional supercomputers have made strides in decoding DNA cost-effectively, the microbiome is far beyond their reach, but not so for future quantum computers. Quantum computers will also allow Big Pharma to better predict how different molecules react with their drugs, thereby significantly speeding up pharmaceutical development and lowering prices. Space exploration. The space telescopes of today (and tomorrow) collect enormous amounts of astronomical imagery data each day, tracking the movements of trillions of galaxies, stars, planets, and asteroids. Sadly, this is far too much data for today's supercomputers to sift through to make meaningful discoveries on a regular basis. But with a mature quantum computer combined with machine learning, all this data can finally be processed efficiently, opening the door to the discovery of hundreds to thousands of new planets daily by the early 2030s. Fundamental sciences. Similar to the points above, the raw computing power these quantum computers enable will allow scientists and engineers to devise new chemicals and materials, as well as better-functioning engines and, of course, cooler Christmas toys. Machine learning. Using traditional computers, machine-learning algorithms need a giant amount of curated and labeled examples (big data) to learn new skills.
With quantum computing, machine-learning software can begin to learn more like humans do, picking up new skills from less data, messier data, and often with fewer instructions. This application is also a topic of excitement among researchers in the artificial intelligence (AI) field, as this improved natural learning capacity could accelerate progress in AI research by decades. More on this in our Future of Artificial Intelligence series. Encryption. Sadly, this is the application that has most researchers and intelligence agencies nervous. All current encryption services depend on keys that would take a modern supercomputer thousands of years to crack; quantum computers could theoretically rip through these encryption keys in under an hour. Banking, communication, national security services, and the internet itself depend on reliable encryption to function. (Oh, and forget about bitcoin as well, given its core dependence on encryption.) If these quantum computers work as advertised, all of these industries will be at risk, at worst endangering the entire world economy until we build quantum encryption to keep pace. Real-time language translation. To end this chapter and this series on a less stressful note, quantum computers will also enable near-perfect, real-time language translation between any two languages, either over a Skype chat or through the use of an audio wearable or implant in your ear. In 20 years, language will no longer be a barrier to business and everyday interactions. For example, a person who only speaks English can more confidently enter into business relationships with partners in foreign countries where English brands would have otherwise failed to penetrate, and when visiting said foreign countries, this person may even fall in love with a certain somebody who only happens to speak Cantonese.
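The logistics examples above come down to combinatorial optimization, where the number of candidate solutions explodes with each added variable. A brute-force route search makes that explosion easy to see. The toy below is standard-library Python with made-up coordinates; it checks every possible visiting order, which becomes hopeless beyond a dozen or so stops, and that wall is exactly what quantum and other specialized approaches hope to push back.

```python
import itertools
import math

# Hypothetical delivery stops as (x, y) coordinates; the first one is the depot.
stops = [(0, 0), (2, 3), (5, 1), (6, 4), (1, 5), (4, 6)]

def route_length(order):
    """Total length of visiting the stops in the given order."""
    return sum(math.dist(order[i], order[i + 1]) for i in range(len(order) - 1))

# Exhaustive search: fix the depot, try every ordering of the remaining stops.
depot, rest = stops[0], stops[1:]
best = min(((depot,) + perm for perm in itertools.permutations(rest)),
           key=route_length)

print("Best route:", best)
print("Length:", round(route_length(best), 2))
# (n-1)! orderings: 5! = 120 here, but roughly 1.2e17 for just 20 stops.
print("Routes checked:", math.factorial(len(rest)))
```

(Requires Python 3.8+ for math.dist.) Real logistics solvers use heuristics rather than exhaustive search, but the underlying growth in the number of candidate routes is the bottleneck the article is pointing at.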
<urn:uuid:b495b5de-b81b-49d7-b914-765c11d1f084>
CC-MAIN-2019-26
https://www.quantumrun.com/prediction/how-quantum-computers-will-change-world-future-computers
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998913.66/warc/CC-MAIN-20190619043625-20190619065625-00138.warc.gz
en
0.927392
1,703
3.9375
4
A Giant Quantum Leap
Chinese scientists have established quantum entanglement between particles 1200 kilometres apart, smashing the previous world record of 143 kilometres. In early 2016, China announced a successful transmission of "entangled" photon pairs from space to the Earth, demonstrating that quantum entanglement persists over large distances. The result is a stepping stone to ultrasecure communication networks and, eventually, a space-based quantum internet. The study was published as a cover story by the U.S. journal Science on Friday.

What is quantum entanglement?
Quantum entanglement occurs when a pair of particles, such as photons, interact physically in such a way that their states remain linked even when separated by a large distance. The entanglement phenomenon also involves putting objects in the peculiar limbo of quantum superposition, in which an object's quantum properties can occupy multiple states at once: both here and there, both dead and alive at the same time. And these multiple quantum states can be shared among multiple objects. So if entangled objects are separated, their precarious quantum states should remain linked until one of them is measured or disturbed. That measurement instantly determines the state of the other object, no matter how far away. This has fuelled speculation that communication faster than light might be possible, and that one day humans might be able to communicate with one another instantly over massive distances. Such an implication troubled Einstein, as it would be in direct violation of his relativity. However, as the Chinese have long observed, comprehended and described, yin-yang balance is achieved constantly and consistently in motion, otherwise reality would collapse, while the same taichi core is a holographic existence in each piece of and at all levels of the universe.

Quantum entanglement and quantum computers
A quantum computer processes data with "qubits" (quantum particles) rather than a common binary system. A qubit can exist in multiple states at the same time, so instead of being stored as either 0 or 1, it can be both 0 and 1. This can bring a quantum leap in computing speed. The unique, long-distance and constant entanglement also points to the possibility of hack-proof communications. Usually, hackers perform cyber attacks by intervening in the system through transmitted signals. Quantum computers can be game changers because entanglement doesn't require such transmission. Further, two parties can exchange secret messages by sharing an encryption key encoded in the properties of entangled particles. Any eavesdropper would affect the entanglement and so be detected.

Efforts to prove quantum entanglement
Physicists began testing the quantum entanglement effect in the 1970s. In 2015, a test which involved measuring entangled electrons 1.3 kilometres apart demonstrated that such a correlation is real. Yet efforts to entangle quantum particles, such as photons, had been limited to about 100 km, mostly because the entanglement is lost when transmitted along optical fibres or through open space.

A Chinese satellite for quantum entanglement research
Last August the Chinese Academy of Sciences put an experimental satellite into orbit.
The satellite, with a design life of two years, was the world's first launched to carry out quantum experiments. It is officially known as Quantum Experiments at Space Scale (QUESS) and also as Mozi, after the Chinese philosopher 墨子 (470BC-391BC), believed to be the first person in the world to have conducted optical experiments. Central to QUESS's experiments is a laser beam mounted on the Micius satellite. The beam was split to generate pairs of photons that share a common quantum state, in this case related to polarization. The entangled photons were funnelled into two onboard telescopes that fired them at separate stations on the ground: one in Delingha in northwest China's Qinghai Province and another in Lijiang in southwest China's Yunnan Province. The team found that the pairs of photons they detected were still entangled. This demonstrates that quantum communication at continental distances can be achieved. The next step in developing a global network is reportedly to test quantum key distribution and to establish a quantum connection between China and a ground station in Vienna. According to the Chinese research team, more satellites with capabilities identical to Micius are planned for launch over the next five years. China is not the only country to perform such experiments, but it is the first to go this far. Teams from Canada, Germany, Austria, Singapore and other countries also have plans for quantum space experiments.
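The "efforts to prove quantum entanglement" mentioned above are Bell-type tests: one measures polarization correlations at several angle settings and checks whether they exceed the bound that any classical (local hidden variable) model allows. The snippet below (NumPy, purely illustrative and not the experiment's actual analysis) computes the CHSH value for an ideal polarization-entangled pair at the standard angle settings, giving about 2.83, above the classical limit of 2.

```python
import numpy as np

# Pauli matrices and the polarization-entangled Bell state (|HH> + |VV>) / sqrt(2).
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

def polarizer(theta):
    """+1/-1-valued observable for a linear-polarization measurement at angle theta."""
    return np.cos(2 * theta) * Z + np.sin(2 * theta) * X

def correlation(a, b):
    """Expectation value of measuring one photon at angle a and the other at angle b."""
    observable = np.kron(polarizer(a), polarizer(b))
    return float(np.real(phi_plus.conj() @ observable @ phi_plus))

# Standard CHSH angle settings (radians).
a1, a2 = 0.0, np.pi / 4
b1, b2 = np.pi / 8, 3 * np.pi / 8

S = correlation(a1, b1) - correlation(a1, b2) + correlation(a2, b1) + correlation(a2, b2)
print(f"CHSH value S = {S:.3f}  (classical bound 2, quantum maximum ~2.828)")
```

In a real experiment, loss and noise pull S below this ideal value, and a measured violation of the classical bound is what certifies that photon pairs shared between distant stations remain genuinely entangled.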
<urn:uuid:9c5e41b5-c896-4f6d-9486-7d4486fa4e0a>
CC-MAIN-2019-26
https://www.viewofchina.com/quantum-leap/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560628000266.39/warc/CC-MAIN-20190626094111-20190626120111-00381.warc.gz
en
0.931929
1,114
3.65625
4
Materials by design: Argonne researchers use genetic algorithms for better superconductors. Owners of thoroughbred stallions carefully breed prizewinning horses over generations to eke out fractions of a second in million-dollar races. Materials scientists have taken a page from that playbook, turning to the power of evolution and artificial selection to develop superconductors that can transmit electric current as efficiently as possible. Perhaps counterintuitively, most applied superconductors can operate at high magnetic fields because they contain defects. The number, size, shape and position of the defects within a superconductor work together to enhance its current-carrying capacity in the presence of a magnetic field. Too many defects, however, can block the current pathway or cause a breakdown of the superconducting material, so scientists need to be selective in how they incorporate defects into a material. "When people think of targeted evolution, they might think of people who breed dogs or horses. Ours is an example of materials by design, where the computer learns from prior generations the best possible arrangement of defects." -- Argonne materials scientist Andreas Glatz. In a new study from the U.S. Department of Energy's (DOE) Argonne National Laboratory, researchers used the power of artificial intelligence and high-performance supercomputers to introduce and assess the impact of different configurations of defects on the performance of a superconductor. The researchers developed a computer algorithm that treated each defect like a biological gene. Different combinations of defects yielded superconductors able to carry different amounts of current. Once the algorithm identified a particularly advantageous set of defects, it re-initialized with that set of defects as a "seed," from which new combinations of defects would emerge. "Each run of the simulation is equivalent to the formation of a new generation of defects that the algorithm seeks to optimize," said Argonne distinguished fellow and senior materials scientist Wai-Kwong Kwok, an author of the study. "Over time, the defect structures become progressively refined, as we intentionally select for defect structures that will allow for materials with the highest critical current." The reason defects form such an essential part of a superconductor lies in their ability to trap and anchor magnetic vortices that form in the presence of a magnetic field. These vortices can move freely within a pure superconducting material when a current is applied. When they do so, they start to generate a resistance, negating the superconducting effect. Keeping vortices pinned, while still allowing current to travel through the material, represents a holy grail for scientists seeking to find ways to transmit electricity without loss in applied superconductors. To find the right combination of defects to arrest the motion of the vortices, the researchers initialized their algorithm with defects of random shape and size. While the researchers knew this would be far from the optimal setup, it gave the model a set of neutral initial conditions from which to work. As the researchers ran through successive generations of the model, they saw the initial defects transform into a columnar shape and ultimately a periodic arrangement of planar defects.
"When people think of targeted evolution, they might think of people who breed dogs or horses," said Argonne materials scientist Andreas Glatz, the corresponding author of the study. "Ours is an example of materials by design, where the computer learns from prior generations the best possible arrangement of defects." One potential drawback to the process of artificial defect selection lies in the fact that certain defect patterns can become entrenched in the model, leading to a kind of calcification of the genetic data. "In a certain sense, you can kind of think of it like inbreeding," Kwok said. "Conserving most information in our defect 'gene pool' between generations has both benefits and limitations as it does not allow for drastic systemwide transformations. However, our digital 'evolution' can be repeated with different initial seeds to avoid these problems." In order to run their model, the researchers required high-performance computing facilities at Argonne and Oak Ridge National Laboratory. The Argonne Leadership Computing Facility and Oak Ridge Leadership Computing Facility are both DOE Office of Science User Facilities. An article based on the study, "Targeted evolution of pinning landscapes for large superconducting critical currents," appeared in the May 21 edition of the Proceedings of the National Academy of Sciences. In addition to Kwok and Glatz, Argonne's Ivan Sadovskyy, Alexei Koshelev and Ulrich Welp also collaborated. Funding for the research came from the DOE's Office of Science. Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation's first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America's scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy's Office of Science. The U.S. Department of Energy's Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https:/ Chris Kramer | EurekAlert! 'Neural Lander' uses AI to land drones smoothly 27.05.2019 | California Institute of Technology New system by TU Graz automatically recognises pedestrians’ intent to cross the road 27.05.2019 | Technische Universität Graz Researchers from Sweden's Chalmers University of Technology and the University of Gothenburg present a new method which can double the energy of a proton beam produced by laser-based particle accelerators. The breakthrough could lead to more compact, cheaper equipment that could be useful for many applications, including proton therapy. Proton therapy involves firing a beam of accelerated protons at cancerous tumours, killing them through irradiation. But the equipment needed is so large and... A new assessment of NASA's record of global temperatures revealed that the agency's estimate of Earth's long-term temperature rise in recent decades is accurate to within less than a tenth of a degree Fahrenheit, providing confidence that past and future research is correctly capturing rising surface temperatures. 
Source: https://www.innovations-report.com/html/reports/information-technology/ai-and-high-performance-computing-extend-evolution-to-superconductors.html
Quantum computers use quantum bits or "qubits" to do their calculations - quantum states, that is, of atoms or electrons that can take on the logical values "0" and "1" at the same time. In order to wire up many such qubits to make a powerful quantum computer, one needs to couple them to each other over distances of millimetres or even several metres. One way of achieving this is by exploiting the charge displacement caused by an electromagnetic wave, which is the working principle of an antenna. Such a coupling, however, also exposes the qubit to disturbances due to unwanted electric fields, which severely limits the quality of the logical qubit operations.

A team of scientists working in several research groups at ETH Zurich, assisted by theoretical physicists at Sherbrooke University in Canada, have now demonstrated how this problem can be avoided. To do so, they found a way to couple a microwave photon to a spin qubit in a quantum dot.

Qubits with charge or spin

In quantum dots, electrons are first trapped in semiconductor structures measuring just a few nanometres that are cooled to less than one degree above the absolute zero of the temperature scale. The logical values 0 and 1 can now be realized in two different ways. One either defines a qubit in terms of the position of the electron on the right or left side of a double quantum dot, or else by the spin of the electron, which can point up or down. The first case is called a charge qubit, which couples strongly to electromagnetic waves through the displacement of electric charge. A spin qubit, on the other hand, can be visualized as a tiny compass needle that points up or down. Much like a compass needle, a spin is also magnetic and, therefore, does not couple to electric but rather to magnetic fields. The coupling of a spin qubit to the magnetic part of electromagnetic waves, however, is much weaker than that of a charge qubit to the electric part.

Three spins for stronger coupling

This means that, on the one hand, a spin qubit is less susceptible to noise and keeps its coherence (on which the action of a quantum computer is based) for a longer period of time. On the other hand, it is considerably more difficult to couple spin qubits to each other over long distances using photons. The research group of ETH professor Klaus Ensslin uses a trick to make such a coupling possible nevertheless, as the post-doc Jonne Koski explains: "By realising the qubit with not just a single spin, but rather three of them, we can combine the advantages of a spin qubit with those of a charge qubit."

In practice, this is done by producing three quantum dots on a semiconductor chip that are close to each other and can be controlled by voltages that are applied through tiny wires. In each of the quantum dots, electrons with spins pointing up or down can be trapped. Additionally, one of the wires connects the spin trio to a microwave resonator. The voltages at the quantum dots are now adjusted in order to have a single electron in each quantum dot, with the spins of two of the electrons pointing in the same direction and the third spin pointing in the opposite direction.

Charge displacement through tunnelling

According to the rules of quantum mechanics, the electrons can also tunnel back and forth between the quantum dots with a certain probability. This means that two of the three electrons can temporarily happen to be in the same quantum dot, with one quantum dot remaining empty. In this constellation the electric charge is now unevenly distributed.
This charge displacement, in turn, gives rise to an electric dipole that can couple strongly to the electric field of a microwave photon. The scientists at ETH were able to clearly detect the strong coupling by measuring the resonance frequency of the microwave resonator. They observed how the resonance of the resonator split into two because of the coupling to the spin trio. From that data they could infer that the coherence of the spin qubit remained intact for more than 10 nanoseconds.

Spin trios for a quantum bus

The researchers are confident that it will soon be possible to realize a communication channel for quantum information between two spin qubits using this technology. "This will require us to put spin trios on either end of the microwave resonator and to show that the qubits are then coupled to each other through a microwave photon", says Andreas Landig, first author of the article and PhD student in Ensslin's group. This would be an important step towards a network of spatially distributed spin qubits. The researchers also emphasize that their method is very versatile and can straightforwardly be applied to other materials such as graphene.

This work was carried out in the framework of the National Centre of Competence in Research Quantum Science and Technology (NCCR QSIT). At ETH Zurich, scientists in the groups of Klaus Ensslin, Thomas Ihn, Werner Wegscheider and Andreas Wallraff were involved in the research.

Landig AJ, Koski JV, Scarlino P, Mendes UC, Blais A, Reichl C, Wegscheider W, Wallraff A, Ensslin K, Ihn T: Coherent spin-photon coupling using a resonant exchange qubit. Nature, 25 July 2018, doi: 10.1038/s41586-018-0365-y

Prof. Dr. Klaus Ensslin | EurekAlert!
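A minimal way to see where the measured doublet described above comes from, under the standard assumption that a single excitation shared between the resonator and the effective two-level qubit behaves like two coupled modes: on resonance, the dressed frequencies split by twice the coupling strength. The numbers below are placeholders, not the values reported in the Nature paper.

```python
import numpy as np

# Minimal coupled-mode model (not the experiment's full theory). On resonance
# the two dressed states are split by 2g, the doublet seen in the resonator
# response. All numbers are illustrative placeholders.

f_r = 6.0e9          # resonator frequency, Hz (illustrative)
f_q = 6.0e9          # qubit frequency tuned onto resonance, Hz (illustrative)
g   = 30.0e6         # spin-photon coupling strength, Hz (illustrative)

H = np.array([[f_r, g],
              [g,   f_q]])
freqs = np.linalg.eigvalsh(H)
print("dressed frequencies (GHz):", freqs / 1e9)
print("splitting (MHz):", (freqs[1] - freqs[0]) / 1e6)   # equals 2g on resonance
```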
Source: https://www.innovations-report.com/html/reports/physics-astronomy/a-spin-trio-for-strong-coupling.html
On March 22, 1909, American physicist Nathan Rosen was born. He is best known for his cooperation with Albert Einstein and Boris Podolsky on the quantum-mechanical description of physical reality, leading to the so-called Einstein-Podolsky-Rosen paradox, as well as for his postulation of wormholes connecting distant areas in space. Although purely theoretical, his work also had an important impact on science fiction literature.

Nathan Rosen was born in New York City. He first studied electrical engineering (bachelor's degree) and then physics (master's degree 1929) at the Massachusetts Institute of Technology, where he received his doctorate in 1932 with the thesis "Calculation of Energies of Diatomic Molecules" under John C. Slater. During his time at the university, Rosen already published several papers on the explanation of an atomic nucleus' structure and on wave functions. He then became a National Research Fellow at the University of Michigan and Princeton University, where he studied theoretical molecular physics (model of the hydrogen molecule). However, he had already written his master's thesis on gravitational physics, and he contacted Albert Einstein at Princeton to get his opinion.

The Assistant of Einstein

Rosen started working as Albert Einstein's assistant in 1935, extending Einstein's studies on wave functions, resulting in a publication together with Boris Podolsky. In the paper, the three scientists attempted to answer the question "Can quantum-mechanical description of physical reality be considered complete?". The effects were then named the Einstein-Podolsky-Rosen paradox (EPR). The EPR paradox contains a thought experiment attempting to reveal insufficiencies of quantum mechanics, and indeed they at least made the case that quantum mechanics, as it stood at the time, was incomplete.

The Einstein-Podolsky-Rosen Paradox

The essence of the paradox is that particles can interact in such a way that it is possible to measure both their position and their momentum more accurately than Heisenberg's uncertainty principle allows, unless measuring one particle instantaneously affects the other to prevent this accuracy, which would involve information being transmitted faster than light as forbidden by the theory of relativity ("spooky action at a distance"). This consequence had not previously been noticed and seemed unreasonable at the time; the phenomenon involved is now known as quantum entanglement. According to quantum mechanics, under some conditions, a pair of quantum systems may be described by a single wave function, which encodes the probabilities of the outcomes of experiments that may be performed on the two systems, whether jointly or individually.

The routine explanation of this effect was, at that time, provided by Heisenberg's uncertainty principle. Physical quantities come in pairs called conjugate quantities. Examples of such conjugate pairs are (position, momentum), (time, energy), and (angular position, angular momentum). When one quantity was measured, and became determined, the conjugate quantity became indeterminate. Heisenberg explained this uncertainty as due to the quantization of the disturbance from measurement.

After working for Einstein, Rosen was encouraged to continue his work in Israel. Both scientists began focusing on wormholes after discovering a mathematical description of wormholes able to connect certain areas in space.
These Einstein-Rosen bridges were found by combining the mathematical solutions of black holes and white holes using Einstein's field equations from 1915. The Einstein-Rosen bridges, also called Schwarzschild wormholes, were completely theoretical, and John A. Wheeler and Robert W. Fuller proved in 1962 that these wormholes are unstable.

A wormhole can be visualized as a tunnel with two ends, each at separate points in spacetime (i.e., different locations and/or different points of time), or by a transcendental bijection of the spacetime continuum. Wormholes are consistent with the general theory of relativity, but whether wormholes actually exist remains to be seen. A wormhole could connect extremely long distances such as a billion light years or more, short distances such as a few meters, different universes, or different points in time.

According to general relativity, the gravitational collapse of a sufficiently compact mass forms a singular Schwarzschild black hole. In the Einstein–Cartan–Sciama–Kibble theory of gravity, however, it forms a regular Einstein–Rosen bridge. This theory extends general relativity by removing a constraint of the symmetry of the affine connection and regarding its antisymmetric part, the torsion tensor, as a dynamical variable. Torsion naturally accounts for the quantum-mechanical, intrinsic angular momentum (spin) of matter. The minimal coupling between torsion and Dirac spinors generates a repulsive spin–spin interaction that is significant in fermionic matter at extremely high densities. Such an interaction prevents the formation of a gravitational singularity. Instead, the collapsing matter reaches an enormous but finite density and rebounds, forming the other side of the bridge.

Wormholes in Science Fiction

However, wormholes fascinated not only scientists; science fiction writers also took a growing interest in them. Numerous writers in literature, television and films used and still use wormholes to transport whole star ships or travel through time, as in the Star Trek movie from 2009 in which Spock and Nero use (fictional) red matter to build artificial black holes and travel back in time. Contrary to physics, there are no limits in science fiction, and even in Star Trek a completely stable wormhole near the planet Bajor can be found, unique also in the Star Trek universe. A notable science fiction novel is also 'The Forever War' by Joe Haldeman from 1974. In the plot, interstellar travel is possible through collapsars, another word for black holes. The plot leans on the theory by Einstein and Rosen, claiming that there may be bridges located in the black holes.

Rosen's Later Life

Rosen was later professor of theoretical physics at the University of Kiev (on Einstein's recommendation) and from 1941 at the University of North Carolina at Chapel Hill, before going to Israel, where he was professor at the Technion in Haifa from 1953 and founder of the Institute of Theoretical Physics there. He was temporarily head of the Physics Department and the Faculty of Nuclear Engineering there. In 1977 he became Distinguished Professor at the Technion. He became emeritus in 1979, but continued to teach gravitational physics at the Technion (as Gerard Swope Professor Emeritus) until 1991. In Israel, he was also involved in building up the engineering education at Ben Gurion University in Be'er Sheva (from 1969 to 1971 he was Dean of Engineering there). Nathan Rosen died on December 18, 1995 in Haifa.
References and Further Reading:
- Nathan Rosen at New York Times Online
- Nathan Rosen Biography
- Wormholes at NASA's Website
- Albert Einstein revolutionized Physics, SciHi Blog, March 14, 2018
- James Chadwick and the Discovery of the Neutron, SciHi Blog, February 27, 2018
- Sir Arthur Eddington – The Man who Proved Einstein's General Relativity, SciHi Blog, November 22, 2012
- The Annus Mirabilis in Physics – Albert Einstein and the Year 1905, SciHi Blog, June 30, 2012
- Albert Abraham Michelson and the Famous Experiment that lead to Einstein's Special Relativity Theory, SciHi Blog, December 19, 2012
- Nathan Rosen at Wikidata
- A life is like a garden – Leonard Nimoy, SciHi Blog, February 28, 2015
- Werner Heisenberg and the Uncertainty Principle, SciHi Blog, December 5, 2012
- Nathan Rosen at the Mathematics Genealogy Project
- A. Einstein and N. Rosen, The Particle Problem in the General Theory of Relativity (PDF; 908 kB), Phys. Rev. 48, 73–77 (1935)
- Timeline of Quantum Physics People, via DBpedia and Wikidata
Source: http://scihi.org/nathan-rosen-wormholes-time-travel/
The effort to build a fully functional quantum computer is one of the most exciting engineering challenges today. We hear of a new potential breakthrough almost every week that gets researchers closer to achieving this goal. But with every new breakthrough comes the question of how quantum technology will affect security.

There are certainly reasons for concern. If a quantum computer were to appear today, virtually all internet communication would become insecure. Even if the technology emerges some years from now, it will still be able to open the secret communications of today. No matter how you cut it, quantum computing will have a profound effect on today's security infrastructure, and organizations of all kinds would be wise to consider the security implications before it's too late.

Protocols and Primitives

Cryptographic protocols, such as Secure Sockets Layer (SSL), Transport Layer Security (TLS) and Hypertext Transfer Protocol Secure (HTTPS), ensure that communication between two parties is authenticated and private. The building blocks of these protocols are various cryptographic primitives, such as authentication schemes (e.g., keyed-hash message authentication code, or HMAC), block ciphers (e.g., advanced encryption standard, or AES), digital signatures (e.g., Digital Signature Algorithm, or DSA) and encryption schemes (e.g., RSA). For the most part, protocols are constructed from primitives in a modular way. Therefore, if the primitives satisfy their respective security properties, so will the protocols.

Cryptographic primitives can be divided into two classes: symmetric and asymmetric. The latter is often referred to as public key. Symmetric primitives assume the parties have a preshared secret key before beginning communication, whereas asymmetric primitives assume the parties begin with no common secret information. Most protocols employ the hybrid approach: the communicating parties first use public key primitives to secretly exchange a string before switching over to the much faster symmetric key primitives, using this common string as the secret key.

An Existential Threat?

Quantum computers will affect symmetric and asymmetric primitives differently. According to Grover's algorithm, quantum computers are able to brute-force search all 2^n possible n-bit keys in 2^(n/2) time, which would require us to double the key sizes to maintain the same level of security. For the most part, this is all quantum computers can do against the symmetric primitives in use today. While certainly notable, this by itself does not pose an existential threat to symmetric cryptography.

Public key primitives are a different story. Virtually all public key primitives used today require that either factoring large integers or finding discrete logarithms in finite groups is a hard problem. However, quantum computers can easily solve these problems using Shor's algorithm. In other words, the bad news is that quantum computers break public key primitives. The good news is that the only primitives that really need fixing are digital signatures and public key encryption, because these are enough to construct virtually every critical internet security protocol. Once these primitives are in place, all the important protocols can be made quantum-safe.
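The key-size rule of thumb above is easy to check numerically; the loop below is illustrative arithmetic, not a benchmark of any real attack.

```python
# Quick arithmetic: Grover's algorithm turns an exhaustive search over 2^n keys
# into roughly 2^(n/2) quantum iterations, so doubling the key length restores
# the original security margin. Illustrative numbers only.
for n in (128, 256):
    grover_iterations = 2.0 ** (n / 2)
    print(f"{n}-bit key: 2^{n} keys to try classically, "
          f"about 2^{n // 2} (= {grover_iterations:.2e}) Grover iterations")
```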
Constructing Quantum-Safe Primitives

As it turns out, public key encryption and digital signature schemes that we believe to be quantum-safe have existed since the late 1970s, even before people were aware of the potential problems quantum computing would pose to cryptography. The problem with these constructions, however, is that they are very inefficient. Keys and/or messages are on the order of megabytes. One of the real breakthroughs in the field of quantum-safe cryptography in the past decade has been the development of new techniques that allow for extremely practical constructions. Lattice-based cryptography has become one of the most fruitful approaches for constructing primitives.

Lattice-based cryptography is rooted in linear algebra. Suppose that one is given a square, full-rank matrix A and a value b = Ax mod p, where x is a vector with 0/1 coefficients and p is a small (e.g., 10-bit) prime. One is then tasked with finding x. This problem has a unique solution, x, which is actually quite easy to find using Gaussian elimination. On the other hand, if one is given a slightly noisy version of Ax, that is Ax+e mod p, where e is some random vector with 0/1 coefficients, then for matrices of large-enough dimension (say, around 512), this problem becomes surprisingly difficult. This type of problem is related to both the subset sum and the learning parity with noise problems that have been widely studied since the 1980s and have not succumbed to any algorithmic attacks, either classical or quantum.

Much of the past decade's research focused on increasing our understanding of different versions of the problem described above and building schemes based on their presumed hardness. In my view, this research line has been a great success. Performancewise, the most efficient lattice-based encryption and signature schemes are much faster than those based on RSA, and have key and output lengths of a similar size. Researchers have also made exciting progress toward building cryptographic primitives, such as fully homomorphic encryption, which allows for evaluation of encrypted data, whose only instantiations are based on the same linear-algebraic assumptions. Lattice-based cryptography provides fast, quantum-safe, fundamental primitives and allows for constructions of primitives that were previously thought impossible. This combination has established lattice-based cryptography as one of the most fascinating research fields with potential to become the backbone of real-world cryptography in the near future.

Today's Solution to Tomorrow's Problem

Can quantum-safe cryptography be used today? The short answer is yes. Lattice-based primitives are efficient and have already been successfully plugged into the TLS and Internet Key Exchange (IKE) protocols. But since quantum computers are not yet here, few security professionals are likely to abandon today's cryptography for a different approach. After all, any new technology comes with growing pains, and IT professionals cannot afford to make even the smallest mistake when dealing with information security.

A slow migration towards lattice cryptography is probably the best bet. An organization looking to future-proof its data should first use lattice cryptography in tandem with traditional primitives. This approach should secure the organization's data as long as at least one of these constructions is secure. This would remove all the risk of introducing a new technology. If implemented correctly, all communication will be quantum-safe.
All it takes is a couple of extra kilobytes of data per communication session. In short, quantum computers may be coming sooner than you think, so be ready!
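To make the noisy linear-algebra problem from the section above concrete, here is a toy instance with deliberately tiny, insecure parameters (dimension 8 and a 7-bit modulus, rather than the hundreds of dimensions and carefully chosen moduli and error distributions used in practice).

```python
import numpy as np

# Toy instance of the noisy linear-algebra ("learning with errors"-style)
# problem described above. Parameters are deliberately tiny and insecure.

rng = np.random.default_rng(0)
p, n = 97, 8

A = rng.integers(0, p, size=(n, n))      # public matrix (almost surely full rank)
x = rng.integers(0, 2, size=n)           # secret 0/1 vector
e = rng.integers(0, 2, size=n)           # small 0/1 error vector

b_exact = A @ x % p                      # no noise: Gaussian elimination recovers x
b_noisy = (A @ x + e) % p                # with noise: recovering x becomes hard as n grows

print("exact b :", b_exact)
print("noisy b :", b_noisy)
```

Without the error vector, Gaussian elimination recovers x immediately; with it, no known classical or quantum attack scales well as the dimension grows, which is exactly the asymmetry lattice-based schemes exploit.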
Source: https://securityintelligence.com/preparing-next-era-computing-quantum-safe-cryptography/
How to Explain Post Quantum Cryptography to Anyone
Dr Michael Scott

It's actually not as complicated as it sounds. Let's get the maths over with first. Remember polynomials? Multiplying two first degree polynomials creates a second degree polynomial (or quadratic): for example, (3x+5)(5x+6) = 15x^2 + 43x + 30. In general two n-th degree polynomials when multiplied together create a 2n-th degree polynomial result. Polynomials can also be added: (3x+5) + (5x+6) = 8x+11. Don't tell me that's hard!

For the polynomial 8x+11, the coefficients are 8 and 11. Next consider polynomials where all of the coefficients are less than a fixed prime number q. If they ever get greater than q, they are reduced to their remainder when divided by q. So if q=7, then 8x+11 becomes x+4, because 8 leaves a remainder of 1 when divided by 7, and 11 leaves a remainder of 4 when divided by 7. That's it for the maths.

The next thing we will do is scale it up a little(!) Let's choose q=12289, and consider polynomials of degree 1024. Such a polynomial is just a long string of terms, each a coefficient less than 12289 multiplied by a power of x (we won't write one out in full, but you get the idea). Again it's easy to multiply two such polynomials, although obviously a computer is needed to do it. In fact due to the cunning choice for q and the degree as a power of 2 (1024 = 2^10), there is a particularly fast way to do the multiplication. Now normally when we multiply two such polynomials, we get a 2048-th degree polynomial product. But here instead we chop this into two 1024-degree halves, and subtract the top half from the bottom half. That's our result, another 1024-degree polynomial. So now we can quickly add, subtract and multiply 1024-degree polynomials to our heart's content in any order, generating 1024-degree polynomial results whose coefficients are all less than q.

We are now ready to do some crypto. First some notation. A polynomial as above with large coefficients we should denote as F(x), but we will simply call it F. We will also make use of polynomials whose coefficients are all small numbers; we will denote these with lower case letters, like f. Note that they are small only in terms of their coefficients, they are still of high degree. We shall call a polynomial with large coefficients a "large polynomial", and a polynomial with small coefficients a "small polynomial".

Now consider this calculation with such polynomials: B = As + e. Given A, s and e, it's easy to calculate B; it's just a multiplication followed by an addition. However given B and A, it's very hard to calculate s and e. Think about it for a while - or just take my word for it! In fact for the size of polynomials we are talking about here it's impossible even if you have a quantum computer! We call the small polynomial e an error polynomial, and the small polynomial s is often a secret. The large polynomial A is often a globally known value.

OK let's do some crypto. Alice and Bob, who have no prior arrangement or shared secrets (although they both know a public large polynomial A), nevertheless want to communicate in private. In fact Bob wants to transmit an encrypted message to Alice that only Alice can read. First Bob encodes his message as a large polynomial M. Alice then calculates B=As+e where s and e are small secret and error polynomials of her own invention, and sends B to Bob. Bob calculates U=At+f and C=Bt+g+M and sends U and C back to Alice, where t, f and g are small polynomials of his own invention. Finally Alice calculates C-Us. Substituting for C and U, and then for B, this becomes et+g+M-fs (there is some fortuitous cancellation).
Check it for yourself. At this stage the reader might well feel a little bewildered, and be wondering – so what? But this is the clever bit. Observe that in the expression et+g+M-fs, only M is a large polynomial. The other components are all small. So in effect M stands out from the "noise", and can thus be recovered by Alice. So Alice got the message, and anyone who eavesdropped their communication gets a problem they cannot possibly solve.

That's basically it. This is the NewHope post-quantum key exchange protocol as described by Alkim, Ducas, Poppelmann and Schwabe. The strength of the protocol depends on the difficulty of the so-called Ring Learning with Errors problem, which is a problem based on lattices, and for which there is no known effective quantum algorithm.

There is of course a bit more to it (!) than given in this simple description, mainly concerning the ways in which the small polynomials are generated, and the statistical distribution of their small coefficients. This needs to be done carefully to avoid some attacks in some contexts. There is also the issue of effective encoding and decoding of the message M. But these are really just details.

At this stage I hope you are thinking – that was surprisingly easy. In fact post-quantum crypto, in my humble opinion, is often quite shallow mathematically. It's also blazingly fast. The downside is that the amount of data that must be exchanged is relatively large – those large polynomials are seriously big chunks of data. But it works!
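The description above maps almost line for line onto code. The sketch below is a toy version with eight coefficients instead of 1024, naive sampling of the small polynomials, and one message bit per coefficient encoded as either 0 or q/2. It is meant only to make the algebra concrete and is in no way a secure or faithful NewHope implementation.

```python
import numpy as np

# Toy, insecure sketch of the scheme described above.

n, q = 8, 12289
rng = np.random.default_rng(1)

def ring_mul(a, b):
    """Multiply two polynomials, then fold the top half back in with a minus
    sign: the 'chop into halves and subtract' step, i.e. reduction mod x^n + 1."""
    full = np.convolve(a, b)                       # degree up to 2n-2
    return (full[:n] - np.append(full[n:], 0)) % q

def small():
    return rng.integers(-1, 2, size=n)             # small-coefficient polynomial

A = rng.integers(0, q, size=n)                     # public large polynomial

# Alice
s, e = small(), small()
B = (ring_mul(A, s) + e) % q

# Bob: encode one bit per coefficient, then encrypt
bits = rng.integers(0, 2, size=n)
M = bits * (q // 2)
t, f, g = small(), small(), small()
U = (ring_mul(A, t) + f) % q
C = (ring_mul(B, t) + g + M) % q

# Alice: C - U*s = e*t + g + M - f*s, and everything except M is small
noisy = (C - ring_mul(U, s)) % q
decoded = ((noisy > q // 4) & (noisy < 3 * q // 4)).astype(int)
print("sent bits:   ", bits)
print("decoded bits:", decoded)
```

Alice recovers Bob's bits because et+g-fs stays tiny compared with q/2, so each noisy coefficient still rounds to the value Bob encoded.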
Source: https://miracl.com/press/miracl-labs/post-quantum-cryptography-for-grandparents
Researchers successfully sent a simulated elementary particle back in time

Don't start investing in flux capacitors just yet, though.

- The second law of thermodynamics states that order always moves to disorder, which we experience as an arrow of time.
- Scientists used a quantum computer to show that time travel is theoretically possible by reverting a simulated particle from an entropic to a more orderly state.
- While Einstein's general theory of relativity permits time travel, the means to achieve it remain improbable in nature.

In 1895 H.G. Wells published The Time Machine, a story about an inventor who builds a device that travels through a fourth, temporal dimension. Before Wells's novella, time travel existed in the realm of fantasy. It required a god, an enchanted sleep, or a bonk on the head to pull off. After Wells, time travel became popularized as a potentially scientific phenomenon. Then Einstein's equations brought us into the quantum realm, and with them a more nuanced view of time.

No less than mathematical logician Kurt Gödel worked out that Einstein's equations allowed for time travel into the past. The problem? None of the proposed methods of time travel were ever practical "on physical grounds." So, "Why stick to physical grounds?" asked scientists from the Argonne National Laboratory, the Moscow Institute of Physics and Technology, and ETH Zurich before they successfully sent a simulated elementary particle back in time. Fair warning: their results are tantalizing but will ultimately dishearten any time lords in training.

The great quantum escape

A quantum computer mixing chamber (Photo: IBM Research/Flickr)

Many of the laws of physics view the future and the past as a difference without a distinction. Not so with the second law of thermodynamics, which states that a closed system always moves from order to disorder (or entropy). Scramble an egg to make your omelet, for example, and you've added a whole lot of disorder into the closed system that was the initial egg.

This leads to an important consequence of the second law: the arrow of time. A process that generates entropy — such as your egg whisking — will be irreversible unless you input more energy. It's why an omelet won't reform back into an egg or why billiard balls don't spontaneously reform a triangle after the break. Like an arrow released, the entropy moves in a single direction, and we witness the effect as time.

We are trapped by the second law of thermodynamics, but the international team of scientists wanted to see if the second law could be violated in the quantum realm. Since such a test is impossible in nature, they used the next best thing: an IBM quantum computer.

Traditional computers, like the one you are reading this on, use a basic unit of information called a bit. Any bit can be represented as either a 1 or a 0. A quantum computer, however, uses a basic unit of information called a qubit. A qubit exists as both a 1 and a 0 simultaneously, allowing the system to compute and process information much faster.

In their experiment, the researchers substituted these qubits for subatomic particles and put them through a four-step process. First, they arranged the qubits in a known and ordered state and entangled them — meaning anything that happened to one affected the others. Then they launched an evolution program on the quantum computer, which used microwave radio pulses to break down that initial order into a more complex state.
Third step: a special algorithm modifies the state of the quantum computer so that disorder moves back toward order. The qubits are again hit with a microwave pulse, but this time they rewind to their past, orderly selves. In other words, they are de-aged by about one millionth of a second. According to study author Valerii M. Vinokur, of the Argonne National Laboratory, this is the equivalent of pushing against the ripples of a pond to return them to their source.

Since quantum mechanics is about probability (not certainty), success was no guarantee. However, in a two-qubit quantum computer, the algorithm managed a time jump an impressive 85 percent of the time. When it was upped to three qubits, the success rate dropped to about 50 percent, which the authors attributed to imperfections in current quantum computers. The researchers published their results recently in Scientific Reports.

Bringing order from chaos

The results are fascinating and spur the imagination, but don't start investing in flux capacitors yet. This experiment also shows us that sending even a simulated particle back in time requires serious outside manipulation. To create such an external force to manipulate even one physical particle's quantum waves is well beyond our abilities.

"We demonstrate that time-reversing even ONE quantum particle is an unsurmountable task for nature alone," study author Vinokur wrote to the New York Times in an email [emphasis original]. "The system comprising two particles is even more irreversible, let alone the eggs — comprising billions of particles — we break to prepare an omelet."

A press release from the Department of Energy notes that the "timeline required for [an external force] to spontaneously appear and properly manipulate the quantum waves" to appear in nature and unscramble an egg "would extend longer than that of the universe itself." In other words, this technology remains bound to quantum computation. Subatomic spas that literally turn back the clock aren't happening.

But the research isn't solely a high-tech thought experiment. While it will not help us develop real-world time machines, the algorithm does have the potential to improve cutting-edge quantum computation. "Our algorithm could be updated and used to test programs written for quantum computers and eliminate noise and errors," study author Andrey Lebedev said in a release.

Is non-simulated time travel possible?

As Kurt Gödel proved, Einstein's equations don't forbid the concept of time travel, but they do set an improbably high hurdle to clear. Writing for Big Think, Michio Kaku points out that these equations allow for all sorts of time travel shenanigans. Gödel found that if the universe rotated and someone traveled fast enough around it, they could arrive at a point before they left. Time travel could also be possible if you traveled around two colliding cosmic strings, traveled through a spinning black hole, or stretched space via negative matter. While all of these are mathematically sound, Kaku points out that they can't be realized using known physical mechanisms.

Similarly, the ability to nudge physical particles back in time remains beyond our reach. Time travel remains science fiction for all intents and purposes. But time travel may one day become an everyday occurrence in our computers, making us all time lords (in a narrow sense).
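Stripped of the hardware details, the "rewind" step amounts to applying the inverse of the scrambling evolution. The sketch below is not the authors' procedure or their algorithm, just a linear-algebra cartoon of it on an ideal, noiseless two-qubit state vector; on real hardware, the device imperfections mentioned above are what pull the success rate down to 85 and 50 percent.

```python
import numpy as np

# Scramble an ordered two-qubit state with a unitary "evolution", then undo it
# by applying the conjugate transpose (the reversed evolution). On ideal
# hardware the recovery is perfect; real devices are noisy.

rng = np.random.default_rng(42)

Z = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(Z)                      # a random 4x4 unitary (the "scrambler")

initial = np.array([1, 0, 0, 0], dtype=complex)     # ordered state |00>
scrambled = U @ initial                             # amplitudes spread out ("entropy" grows)
recovered = U.conj().T @ scrambled                  # the time-reversal step

print("overlap with |00> after scrambling:", abs(scrambled[0]) ** 2)
print("overlap with |00> after reversal:  ", abs(np.vdot(initial, recovered)) ** 2)
```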
The "sky is the limit," says Northwell Health researcher Dr. Todd Goldstein. - Medical professionals are currently using 3D printers to create prosthetics and patient-specific organ models that doctors can use to prepare for surgery. - Eventually, scientists hope to print patient-specific organs that can be transplanted safely into the human body. - Northwell Health, New York State's largest health care provider, is pioneering 3D printing in medicine in three key ways. Beyond Beef sizzles and marbleizes just like real beef, Beyond Meat says. - Shares of Beyond Meat opened at around $200 on Tuesday morning, falling to nearly $170 by the afternoon. - Wall Street analysts remain wary of the stock, which has been on a massive hot streak since its IPO in May. - Beyond Meat faces competition from Impossible Foods and, as of this week, Tyson. The most valuable college majors will prepare students for a world right out a science fiction novel. - The future of work is going to require a range of skills learned that take into account cutting edge advancements in technology and science. - The most valuable college majors in the future will prepare students for new economies and areas of commerce. - Mathematics, engineering and science related educational majors will become an ubiqitous feature of the new job market. A recent study used data from the Big Five personality to estimate psychopathy prevalence in the 48 contiguous states and Washington, D.C. - The study estimated psychopathy prevalence by looking at the prevalence of certain traits in the Big Five model of personality. - The District of Columbia had the highest prevalence of psychopathy, compared to other areas. - The authors cautioned that their measurements were indirect, and that psychopathy in general is difficult to define precisely. SMARTER FASTER trademarks owned by The Big Think, Inc. All rights reserved.
Source: https://bigthink.com/surprising-science/particle-time-travel?rebelltpage=3
How Will Quantum Computing Change The World?

Firstly, I would like to explain what the concept of 'Quantum Computing' is.

What is quantum computing?

Quantum computing takes advantage of the ability of subatomic particles to exist in more than one state at any time. Due to the way the tiniest of particles behave, operations can be done much more quickly and use less energy than in classical computers. In classical computing, a bit is a single piece of information that can exist in two states, 1 or 0. Quantum computing uses quantum bits. These are quantum systems with two states. However, unlike a usual bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values. Quantum computing is computing using quantum-mechanical phenomena, such as superposition and entanglement. Let's break these two concepts down.

What exactly is superposition? Superposition is the ability of a quantum system to be in multiple states at the same time until it is measured. Because the concept is difficult to understand, this essential principle of quantum mechanics is often illustrated by an experiment carried out in 1801 by the English physicist Thomas Young. Young's double-slit experiment was intended to prove that light consists of waves. Today, the experiment is used to help people understand the way that electrons can act like waves and create interference patterns.

For this experiment, a beam of light is aimed at a barrier with two vertical slits. The light passes through the slits and the resulting pattern is recorded on a photographic plate. When one slit is covered, the pattern is what would be expected: a single line of light, aligned with whichever slit is open. Intuitively, one would expect that if both slits are open, the pattern of light will reflect two lines of light aligned with the slits. In fact, what happens is that the photographic plate separates into multiple lines of lightness and darkness in varying degrees. What is being illustrated by this result is that interference is taking place between the waves going through the slits, in what, seemingly, should be two non-crossing trajectories. Each photon not only goes through both slits; it simultaneously takes every possible trajectory en route to the photographic plate.

Quantum entanglement is a physical phenomenon which occurs when pairs or groups of particles are generated or interact in ways such that the quantum state of each particle cannot be described independently of the state of the other(s), even when the particles are separated by a large distance—instead, a quantum state must be described for the system as a whole.

Now that I have explained these two phenomena, I can show how quantum computing will change the world. Firstly, we need to clear up a misconception about quantum computing: we have become so accustomed to advances in computing being reflected in slimmer, faster laptops and bigger memories that quantum computing is often envisaged in the same terms. It shouldn't be. Digital computers manipulate information encoded in binary form as sequences of ones and zeros; the rest is software, whether that involves converting keystrokes or mouse movements into images, or taking numbers and feeding them into an equation to work out the answer. Quantum computers are no different, except in one crucial respect. In a conventional computer, one bit of binary data can have one of just two values: one or zero.
But in a quantum computer, these switches, called quantum bits, have more options, because they are governed by the laws of quantum theory. Thanks to superposition, qubits can, in effect, encode one and zero at the same time. As a result, quantum computers can represent many more possible states of binary ones and zeros. A classical bit can represent two states: zero and one. Add a bit to your computer's processor and you can encode one more piece of binary information. Yet if a group of qubits are placed in a joint superposition, called an entangled state, each additional qubit doubles the encoding capacity. By the time you get to 300 qubits – as opposed to the billions of classical bits in the dense ranks of transistors in your laptop's microprocessors – you have 2^300 options. That's more than the number of atoms in the known universe.

Quantum computers have largely been advertised on the promise that they will be vastly faster at crunching through calculations than even the most powerful of today's supercomputers. This speed-up – immensely attractive to scientists and analysts solving complex equations or handling massive data sets – was made explicit in 1994 when the American mathematician Peter Shor showed in theory that a computer juggling coherent qubits would be able to factor large numbers much more efficiently than classical computers. Reducing numbers to their simplest factors – decomposing 12 to "two times two times three", for example – is an exercise in elementary arithmetic, yet it becomes extremely hard for large numbers because there's no shortcut to trying out all the possible factors in turn. Factorising a 300-digit number would take current supercomputers hundreds of thousands of years, working flat out. For this reason, a lot of data encryption – such as when your credit card details are sent to verify an online purchase – uses codes based on factors of large numbers, which no known computer can crack. Yet Shor showed that a quantum factorisation algorithm could find factors much more efficiently than a classical one can. As well as factorisation, quantum computation should be able to speed up database searches – and there's no question how useful that would be, for example in combing through the masses of data generated in biomedical research on genomes.

One of the likely first big applications of quantum computing isn't going to set the world of personal computing alight, but it could transform an important area of basic science. Computers operating with quantum rules were first proposed in 1982 by the American physicist Richard Feynman. He wasn't concerned with speeding up computers, but with improving scientists' ability to predict how atoms, molecules and materials behave using computer simulations. Atoms observe quantum rules, but classical computers can only approximate these in cumbersome ways: predicting the properties of a large drug molecule accurately, for example, requires a state-of-the-art supercomputer. Quantum computers could hugely reduce the time and cost of these calculations. In September, researchers at IBM used the company's prototype quantum computer to simulate a small molecule called beryllium dihydride. A classical computer could, it's true, do that job without much trouble – but the quantum computer doing it had just six qubits. With 50 or so qubits, these devices would already be able to do things beyond the means of classical computers.

To conclude, in this essay I have outlined what a quantum computer is and how it can change the world.
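The doubling argument made above can be stated as a short calculation; the point is simply that the classical description of an n-qubit register needs 2^n complex amplitudes, which is why simulating even a few hundred qubits on ordinary hardware is hopeless.

```python
# Describing an n-qubit register classically takes 2^n complex amplitudes.
for n in (1, 10, 50, 300):
    print(f"{n:4d} qubits -> 2^{n} ≈ {2.0 ** n:.3e} amplitudes to track classically")
```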
The last brief point I would like to address is that, though a quantum computer is on the way, there are many questions about how long it could take. Another big difficulty is dealing with errors. Given the difficulty of keeping qubits coherent and stable, these seem inevitable: qubits are sure to flip accidentally now and again, such as a one changing to a zero or getting randomised. Dealing with errors in classical computers is straightforward: you just keep several copies of the same data, so that faulty bits show up as the odd one out. But this approach won't work for quantum computing, because it's a fundamental and deep property of quantum mechanics that making copies of unknown quantum states (such as the states of qubits over the course of a computation) is impossible. Developing methods for handling quantum errors has kept an army of researchers busy over the past two decades. It can be done, but a single error-resistant qubit will need to be made from many individual physical qubits, placing even more demands on the engineering.

You can only access the opportunities that the quantum computer holds if all the qubits are mutually dependent: in a collective or "coherent" state, which, crudely speaking, means that if we do something to one of them (say, flip a one to a zero), all the others "feel" it. Generally, this requires all the qubits to be placed and maintained in an entangled state. The difficulty of making a quantum computer mostly involves making and sustaining these consistent states of many qubits. Quantum effects such as superposition and entanglement are delicate and easily disrupted. The jangling atomic motions caused by heat can wash them away. So, to be consistently entangled, qubits must be cooled to extremely low temperatures – we're typically talking less than a degree above absolute zero (-273° C) – and kept well isolated from the laboratory environment: that is, from the very equipment used to manipulate and measure them. That's partly why the IBM quantum computer I saw is so bulky: much of it consists of cooling equipment and insulation from the lab environment.
Source: https://weberscustominteriors.com/how-will-quantum-computing-change-the-world/
Author: Sarah Kearns
Editors: David Mertz, Zuleirys Santana Rodriguez, and Scott Barolo

In a previous post, we discussed how proteins fold into unique shapes that allow them to perform their biological functions. Through many physical and chemical properties, like hydrogen bonding and hydrophobicity, proteins are able to fold correctly. However, proteins can fold improperly, and sometimes these malformed peptides aggregate, leading to diseases like Alzheimer's. How can we figure out when the folding process goes wrong? Can we use computers to figure out the folding/misfolding process and develop methods to prevent or undo the damage done by protein aggregates?

In the late 1960s, a scientist named Cyrus Levinthal noted that protein folding is different from regular chemical reactions. Chemical reactions proceed from a reactant to a product via a set pathway of structures and intermediates. Proteins do not do this because a protein doesn't find just one intermediate shape as it folds — it can potentially find millions. Levinthal concluded that a new protein, moving through so many intermediate structures, must take an enormously long time to find its final native state.

To understand the vast number of conformational possibilities, let's take a polypeptide of 101 amino acids. There will be a total of 100 bonds connecting amino acids, each bond having six possible conformations (see Figure 1). This means that a protein of 101 amino acids has 3^100, or 5×10^47, configurations—and some proteins are five or ten times longer! Even if our 101-amino acid protein were able to sample 10^13 conformations per second, it would still need 10^27 years to try all possible shapes. However, in reality, it takes seconds, not eons, for a protein to find its native conformation.

This leads to a big question: Can humans predict how proteins will fold? Even with the help of computers, which can test each possible shape in microseconds, testing them all would require 30 years of computation just for one protein.

Simplifying Structure Prediction

Protein structural features, such as hydrogen and ionic bonding and hydrophobic interactions, are difficult to predict rationally just from the amino acid sequence. Instead, a database of protein structures found by x-ray crystallography, called the Protein Data Bank, has been more helpful in determining the rules of protein folding. Still, determining protein structures accurately is difficult and time-consuming. Some computational shortcuts have made the process simpler, but the predicted folds still are not exact. The biggest simplifications are made by assuming a lattice structure or using a coarse-grained representation. The former maps a globular protein, which typically has variable bond lengths between amino acids, onto a lattice with uniform bond lengths, placing each residue on a 3D grid and thus limiting the possible placements of each amino acid. A coarse-grained model simplifies a protein structure by representing each amino acid as a single point (see Figure 2).

So far, computational prediction of protein structures is limited to these simpler models because more realistic all-atom energy diagrams are too complex and computationally heavy. In our protein of 101 amino acids, there are close to 2000 atoms to move around in 3^100 configurations. With the advent of quantum computing, such problems are becoming easier to solve, but for now, they still use coarse-grained representations.
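Levinthal's counting argument from the paragraphs above is easy to reproduce with a few lines of arithmetic; the constants below simply follow the numbers quoted in the text.

```python
# Levinthal's counting argument: ~3^100 backbone conformations, sampled at
# 10^13 per second, still takes on the order of 10^27 years to enumerate.
conformations = 3 ** 100
rate = 1e13                      # conformations sampled per second
seconds_per_year = 3.15e7
years = conformations / rate / seconds_per_year
print(f"conformations:            {float(conformations):.1e}")
print(f"years to sample them all: {years:.1e}")
```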
How Your PC Can Help Mine Data

Some researchers have turned such computational problems into citizen science projects. Perhaps the most famous of these is Foldit, developed by the Center for Game Science and the Department of Biochemistry at the University of Washington. Foldit is an online game where players compete to create accurate protein structures by moving around the backbone chain, amino acid residues, and domains. Players score points by packing the protein, hiding hydrophobic residues, and clearing any clashes between side chains to minimize the energy of the overall structure. The lowest-energy conformations from the game are then collected and analyzed to improve real-life folding algorithms.

A less hands-on folding program is Folding@home from Stanford University, which borrows unused processors on your personal computer to work on a folding algorithm. While users check their emails or listen to music, or even when the screensaver runs, their computers solve structures and compute minimization functions.

All this data has gone towards the goal of figuring out both how malformed proteins aggregate and how to design drugs that will prevent misfolding. Foldit has already produced a retrovirus structure that is being used to determine inhibitors of HIV. One of the labs behind Foldit has been focusing on proteins involved in cancer, AIDS, and other diseases. The Folding@home project has produced about 130 peer-reviewed papers describing its accomplishments in simulating not only protein folding but also molecular dynamics, which helps determine the ability of drugs to bind. Having an idea of what the protein does and where it does it, without having to use expensive machines to do crystallography (to get the structure of a protein) or high-throughput screening (to find the substrates of a protein), saves both time and resources when developing a drug.

More work has to be done before computational predictions perfectly line up with crystal structures. But when that day comes, we will be much closer to understanding how proteins work, and how to cure diseases of protein folding and function.
Source: https://misciwriters.com/2017/03/14/computing-levinthals-paradox-protein-folding-part-2/
Quantum computers are often seen in Sci-Fi movies like I, Robot and Eagle Eye as computers with capabilities beyond those of present day computers. The computer named Becky in Eagle Eye was especially memorable, attempting to kill government officials including the president by hacking into high security systems. How were quantum computers able to express feelings and hack into everything in these movies? Today, I'd like to introduce you to the world of quantum computers.

Definition and Mechanism of Quantum Computers

Quantum computers are computers that process data using the principles of quantum mechanics such as entanglement and superposition. They are also called 'future computers' that surpass supercomputers by using quantum mechanics as their computing element. It was Richard Feynman, a theoretical physicist in the US, who recognized the potential and necessity of quantum computers in 1982, and David Deutsch from Oxford University defined them in more detail.

Let's take a look at the mechanics of quantum mechanics utilized in quantum computers. Quantum computers make use of two basic phenomena: quantum superposition and entanglement. Quantum superposition is when particles are held in an arbitrary state until they are measured, and quantum entanglement describes an entire group of particles entering a fixed state when one of the entangled particles is observed. These physical phenomena may be quite complicated to grasp. Let's think about the cat experiment from Erwin Schrodinger, an Austrian physicist, famously known as Schrodinger's cat. Let's say there's a sealed box with potassium cyanide and a cat in it. Until we open the box and take a look inside, we wouldn't know if the cat is dead or alive. This means that until the moment we open the box we cannot know whether the cat is dead or not, and that the situation is fixed simultaneously when the observation is made.

Based on these two phenomena, quantum computers use the qubit, a unit in a quantum state, for calculations. Unlike digital computers, whose bits have two states, 0 and 1, quantum computers use qubits; a pair of qubits can represent four states: 00, 01, 10, and 11. A qubit can be 0 or 1, and 0 and 1 can both exist at once as well. Therefore, quantum computing means processing in a state where the decision of whether something is at 0 or 1 can't be made. The computing speed goes up in the form of 2^n, and if there are 512 qubits the speed is going to be as fast as 2^512.

D-Wave, the First Commercial Quantum Computer

Quantum computer D-Wave

In 2011, D-Wave Systems launched the 128-qubit quantum computer D-Wave One as the first commercial quantum computer, and the 512-qubit D-Wave Two in 2013. Its price was set at $10 million, and Lockheed Martin, Google, and NASA were some of the first buyers of the new product. Because D-Wave uses a superconductor made of niobium, it has to be operated at a temperature near absolute zero (-273℃). The photo above makes it look like a giant computer, but the actual processor inside is as small as a human fist. The outer part of the computer is made of a liquid helium cooling system which keeps it near absolute zero and also lowers noise.

Unlike the quantum computer defined by David Deutsch, it works based on the quantum annealing phenomenon to solve optimization (NP-complete) problems. Quantum annealing here means the process of electrons finding the most stable state, called the ground state, as the raised temperature of the system drops back down.
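D-Wave's processor performs annealing in superconducting hardware, and nothing below should be read as its actual algorithm. A classical simulated-annealing loop on a made-up Ising-style cost function, though, conveys the same "cool slowly toward the lowest-energy ground state" idea that the quantum annealing description above relies on.

```python
import math
import random

# Classical stand-in only: ordinary simulated annealing on an invented
# Ising-style cost function, to illustrate "cool slowly toward the ground state".

random.seed(0)
N = 20
J = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(N)]   # random couplings

def energy(spins):
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(N) for j in range(i + 1, N))

spins = [random.choice([-1, 1]) for _ in range(N)]
current = energy(spins)
T = 5.0
while T > 0.01:
    i = random.randrange(N)
    spins[i] *= -1                         # propose flipping one spin
    proposed = energy(spins)
    if proposed <= current or random.random() < math.exp((current - proposed) / T):
        current = proposed                 # accept the move
    else:
        spins[i] *= -1                     # reject: undo the flip
    T *= 0.999                             # lower the "temperature" gradually

print("low-energy spin configuration found, energy =", current)
```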
Pros and Cons of D-Wave According to benchmarks Google reported in January 2014, the D-Wave machine solves certain optimization problems much faster than an ordinary PC. Some reports found cases where it was actually slower, but on average it appears to beat PCs on optimization problems whose structure suits the hardware. D-Wave has three major drawbacks. First, despite being called the world's first commercial quantum computer, D-Wave is not regarded by many researchers as a full quantum computer, because a conventional computer is required to program the device and read the results out of the quantum processor. Some describe it as a 'half-quantum' computer: a regular workstation with a qubit-based co-processor on the side. Second, the processor generates heat while operating, and noise from the cooling system needed to remove that heat can introduce computing errors. The machine is also physically large because of the bulky refrigeration unit required to hold the chip near absolute zero. Finally, unlike gate-model quantum computers, D-Wave relies on quantum tunnelling during annealing; tunnelling is the phenomenon in which a particle passes, with some probability, through an energy barrier higher than its own energy. As a result, its computing speed is not overwhelmingly faster than that of existing computers except on particular kinds of calculation. Prospects of Quantum Computers Quantum computers still have few algorithms for general-purpose problems; the best-known quantum algorithms target specific operations such as prime factorization and the Fourier transform. Once we can keep enough qubits stable to run such algorithms reliably, and perhaps find a room-temperature superconductor whose electrical resistance drops to nearly zero without extreme cooling, quantum computers may become common enough to appear in our daily lives. New materials and products that exploit quantum mechanics are already arriving, such as televisions based on quantum dots, and when quantum communication enters everyday use, wiretapping and data interception will become practically impossible. I think the day when these technologies go beyond our imagination and appear in our real lives is not far away, and I look forward to seeing more quantum computers in the future. Written by Deok Lee, University student reporter for LG CNS References: http://navercast.naver.com/contents.nhn?rid=122&contents_id=31579; D-Wave experiment report (https://plus.google.com/+QuantumAILab/posts/DymNo8DzAYi); http://www.dongascience.com/sctech/view/720
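As a rough classical analogue of the annealing idea described in this piece, the sketch below runs ordinary simulated annealing on a small, made-up Ising-style problem. The couplings, cooling schedule, and problem size are arbitrary assumptions, and nothing here reflects D-Wave's hardware or programming interface; it only illustrates the general notion of cooling a system toward a ground state.

```python
# Classical simulated annealing on a tiny, hypothetical 6-spin Ising problem.
import math, random

random.seed(1)
n = 6
# Made-up couplings J[(i, j)] between spins taking values +1 or -1.
J = {(i, j): random.choice([-1.0, 1.0]) for i in range(n) for j in range(i + 1, n)}

def energy(s):
    return sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

spins = [random.choice([-1, 1]) for _ in range(n)]
T = 5.0
while T > 0.01:
    k = random.randrange(n)
    # Energy change if spin k were flipped.
    delta = -2 * spins[k] * sum(Jij * spins[j if i == k else i]
                                for (i, j), Jij in J.items() if k in (i, j))
    if delta < 0 or random.random() < math.exp(-delta / T):
        spins[k] = -spins[k]
    T *= 0.999  # gradually "cool" toward a low-energy, ground-state-like configuration

print("configuration:", spins, "energy:", energy(spins))
```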
<urn:uuid:c06c108a-706d-4935-aec8-9ac077d41062>
CC-MAIN-2019-26
http://www.lgcnsblog.com/features/quantum-computers-a-step-above-your-average-computer/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627999946.25/warc/CC-MAIN-20190625192953-20190625214953-00315.warc.gz
en
0.930386
1,344
3.59375
4
The ability to produce arbitrarily superposed quantum states is a prerequisite for creating a workable quantum computer. Such highly complex states can now be generated on demand in superconducting electronic circuitry. As an innocent reductionist in elementary school, I dreamed of creating everything in the world by assembling atoms one by one, just like building with Lego blocks. Decades later, the dream has, to some extent, come true with the 'bottom-up' approach of nanotechnology in which, for example, single atoms can be manipulated and assembled using the tip of a scanning probe microscope. But physicists are now playing with even fancier — and often more fragile — 'quantum Lego blocks'. Using a bottom-up approach, Hofheinz et al.1 (page 546 of this issue) report on-demand synthesis of arbitrary quantum states in a superconducting resonator circuit. Starting from a vacuum (zero-photon) state, the authors pile up photons one by one in the resonator and create complex quantum states in an entirely deterministic way. Quantum mechanics was founded and, to a great extent, developed during the last century. Despite its weird and counterintuitive predictions, such as the uncertainty principle and the superposition and entanglement of states, it has stood up to a number of tests, and has proved itself to be a rigorous foundation across a broad spectrum of physics fields, from particle physics to solid-state physics. But only relatively recently have people recognized that the paradoxical nature of quantum mechanics is in itself useful in many applications, such as quantum cryptography and quantum computation. This recognition has boosted research on technologies of quantum-state engineering in various types of physical setting, and the twenty-first century will hopefully be memorable for the implementation of such technologies. Among physical systems currently being investigated, superconducting (zero-resistance) macroscopic circuits stand in a unique position. Although the naive expectation is that quantum mechanics is normally associated with single microscopic systems such as atoms, nuclei and electrons, it has been shown that quantum-mechanical behaviour can be observed and controlled in human-designed, superconducting circuits that are micrometres or even millimetres in size2. The simplest example of a superconducting quantum circuit is a linear resonator consisting of an inductor and a capacitor. If proper parameters are chosen, such a circuit can store a number of energy quanta (photons) at a microwave frequency. Another example is a quantum bit (or qubit), which is an effective two-state system. It can be implemented using a Josephson junction — a tunnel junction between two superconductors — as a nonlinear inductor; the two states are the ground and the first excited state of the nonlinear circuit. Coherent control of quantum states in such circuits and their combinations3 is the basis of superconducting quantum-state engineering. To synthesize quantum states in a resonator, Hofheinz et al.1 use a circuit (see Fig. 1a on page 546) in which a resonator is coupled to a qubit. Because it is not possible to create arbitrary quantum states in resonator circuits using classical control signals alone, the qubit is used as a 'forklift' to load photons one by one into the resonator. Each cycle consists of two sequential steps. First, the qubit, initially detuned off-resonant with the resonator, is excited by a microwave pulse. 
Then, the qubit energy level is tuned into resonance with the resonator, enabling coherent transfer of energy quanta. A similar technique was proposed for an optical cavity with an atom inside4, and was demonstrated for the motional states of an ion in a trap5. Hofheinz and colleagues have also previously reported6 generation of states with a certain number of photons (N-photon or Fock states) — with up to 15 photons7 — based on the same scheme. In their new study1, they perfect the scheme to precisely control not only the amplitude but also the phase of each quantum loading. This allows them to synthesize, in a completely deterministic manner, quantum states that are the largest-ever arbitrary superpositions of multiple N-photon states. Another distinctive aspect of Hofheinz and colleagues' experiment is the quantitative characterization and visualization (Fig. 1) of the generated quantum states, which they attained using Wigner tomography. This method fully characterizes, by means of the Wigner function, the resonator's quantum state: just as tomography is used in medical diagnoses such as magnetic-resonance and X-ray imaging, Wigner tomography allows the quantum state to be completely reconstructed from a large number of measurements. In this case, such measurements were taken by using the same qubit, now as a diagnostic probe, to unload energy quanta from the resonator. In the past year, an analogous technique was used to characterize the quantum state of a microwave field in a three-dimensional cavity, using atoms passing through the cavity as probes of the radiation field8. Comparison1 of the observed and simulated Wigner functions (Fig. 1) clearly indicates that the target quantum states were synthesized with high fidelity. The ability to accurately create and control superposed quantum states is the first requisite for quantum computing. Moreover, coupling between qubits and resonators, such as that achieved in this study, has already shown its value in the implementation of quantum gates — the analogues of logic gates in conventional computers — between remote qubits9,10. That said, the complexity and accuracy of the quantum states achieved by the authors is limited by decoherence — that is, the vulnerability of the quantum superposition. In superconducting circuits, quantum coherence tends to be lost more quickly than in atoms. This is not surprising if one considers the macroscopic nature of the circuits, which makes them interact more strongly with their surroundings. Efforts to achieve longer coherence times are ongoing, and include improving circuit design and reducing the number of defects in the materials from which circuit components are made. Studying the decay of coherence in a variety of quantum states will be a valuable approach to understanding what mechanism triggers decoherence itself and the crossover from quantum to classical behaviour7,8. For now, Hofheinz and colleagues' experiment has set the stage for further developments in quantum-state engineering in superconducting electronic circuitry, and has brought physicists a step closer to realizing a workable quantum computer.
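For readers who like to see the photon-loading cycle in symbols, here is an idealized numerical sketch of repeated qubit-resonator swaps building up a three-photon state under a resonant Jaynes-Cummings interaction. The coupling rate, Fock-space truncation, and perfect pi-pulses are assumptions for illustration only; this is not the authors' code, and it ignores decoherence and the amplitude and phase control needed to make arbitrary superpositions.

```python
# Idealized "forklift" loading: excite the qubit, then let a resonant
# Jaynes-Cummings swap hand one photon to the resonator; repeat three times.
import numpy as np
from scipy.linalg import expm

dim = 6                                        # truncated photon-number space
a = np.diag(np.sqrt(np.arange(1, dim)), 1)     # annihilation operator: a|n> = sqrt(n)|n-1>
sm = np.array([[0.0, 1.0], [0.0, 0.0]])        # qubit lowering operator |g><e|
g = 2 * np.pi * 10e6                           # assumed coupling rate (rad/s), illustrative

# Resonant interaction H = g (a sigma+ + a^dagger sigma-), with hbar = 1.
H = g * (np.kron(a, sm.T) + np.kron(a.T, sm))

ket_g, ket_e = np.array([1.0, 0.0]), np.array([0.0, 1.0])
sigma_x = np.outer(ket_e, ket_g) + np.outer(ket_g, ket_e)
state = np.kron(np.eye(dim)[0], ket_g)         # start: empty resonator, qubit in ground state

for n in range(3):                             # load three photons, one per cycle
    state = np.kron(np.eye(dim), sigma_x) @ state          # idealized microwave pi-pulse on the qubit
    t_swap = np.pi / (2 * g * np.sqrt(n + 1))               # full transfer |e,n> -> |g,n+1>
    state = expm(-1j * H * t_swap) @ state

photon_probs = (np.abs(state.reshape(dim, 2)) ** 2).sum(axis=1)
print(np.round(photon_probs, 3))               # ~[0, 0, 0, 1, 0, 0]: three photons loaded
```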
<urn:uuid:5b5bc333-5278-4a4d-b9d2-4e764453c714>
CC-MAIN-2019-26
https://www.nature.com/articles/459516a?error=cookies_not_supported&code=d5fdd9eb-962a-4559-8305-9894c7db7552
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560628000575.75/warc/CC-MAIN-20190626214837-20190627000503-00004.warc.gz
en
0.936459
1,342
3.578125
4
Have you ever tried to match wits with a computer? Perhaps you've tried playing it in a game of chess or raced to perform a calculation before your laptop could spit out the correct answer. You probably lost the chess game, and the computer definitely beat you in the mathematics race. At face value, then, a computer seems faster and smarter than the human brain, but there is actually far more to the story. If you had posed this same question a few decades ago, there would have been no contest: the human brain could run circles around computers. But is that still true? Has technology begun to catch up to the most remarkable and awe-inspiring organ in the human body? Old Ideas Aren't Always the Best Since the inception of the first computers, there has been a direct comparison between these "computational machines" and the human brain. One phrase that has stuck around for decades, and which encourages the brain-versus-computer framing, is "brains are analogue, computers are digital". This makes it seem like computers are superior, but in truth the human brain is far more advanced and efficient, and possesses more raw computational power than the most impressive supercomputers ever built. At the time of this writing, the fastest supercomputer in the world is the Tianhe-2 in Guangzhou, China, with a maximum processing speed of 54.902 petaFLOPS. A petaFLOPS is a quadrillion (one thousand trillion) floating-point operations per second. That is a huge number of calculations, and yet it doesn't come close to the processing speed of the human brain. Our brains operate on the next order up: although it is impossible to calculate precisely, the human brain is often estimated to operate at about 1 exaFLOPS, equivalent to a billion billion calculations per second. In 2014, researchers in Japan tried to match one second of processing in just one percent of the brain. That doesn't sound like very much, and yet it took the fourth-fastest supercomputer in the world (the K computer) 40 minutes to crunch the calculations for a single second of brain activity! Brains Are Very Different From Computers When we discuss computers, we are referring to meticulously designed machines based on logic, reproducibility, predictability, and math. The human brain, on the other hand, is a tangled, seemingly random mess of neurons that do not behave in a predictable manner. Biology is a beautiful thing, and life itself is much smarter than computers. For example, the brain is both hardware and software at once, whereas in a computer the two are inherently separate. The same interconnected areas, linked by billions of neurons and perhaps trillions of glial cells, can perceive, interpret, store, analyze, and redistribute information at the same time. Computers, by their very definition and fundamental design, have some parts for processing and others for memory; the brain doesn't make that separation, which makes it hugely efficient. Calculations that might take a computer a few million steps can be achieved by a few hundred neuron transmissions, requiring far less energy and running at far greater efficiency. The energy required to run the world's fastest supercomputer would be enough to power a building; the human brain achieves comparable processing with roughly the energy needed to power a dim lightbulb.
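The numbers above are easy to sanity-check. The short sketch below simply redoes the arithmetic, taking the article's 1 exaFLOPS estimate for the brain as a given rather than a measured fact.

```python
# Back-of-the-envelope check of the figures quoted above.
PETA, EXA = 1e15, 1e18

tianhe2 = 54.902 * PETA       # peak speed of Tianhe-2, in FLOPS
brain = 1.0 * EXA             # the article's assumed effective "speed" of the brain

print(brain / tianhe2)        # ~18x: brain estimate vs. fastest supercomputer

# The K computer needed 40 minutes of wall-clock time per simulated second of 1% of the brain:
slowdown_one_percent = (40 * 60) / 1.0
print(slowdown_one_percent * 100)   # ~240,000x shortfall for the whole brain in real time
```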
Biological processes have had billions of years to evolve remarkably efficient organs that still outstrip our technology, and artificial systems are only beginning to approach those limits. One of the things that truly sets brains apart, aside from their advantage in raw computing power, is the flexibility they display. Essentially, the human brain can rewire itself, a feat more formally known as neuroplasticity. Neurons are able to disconnect and reconnect with others, and even change their basic features, something that a carefully constructed computer cannot do. We see this transformative ability in a wide variety of brain functions, such as the formation of memories, knowledge acquisition, physical development, and even recovery from brain damage. When the brain identifies a more efficient or effective way to compute and function, it can alter its physical and neuronal structure, hence the term "plasticity". Until we achieve true artificial intelligence (in which computers should theoretically be able to rewire themselves), neuroplasticity will keep the human brain at least one step ahead of "static" supercomputers. Looking Towards the Future If there is one thing about human beings, it's that we don't like being told something is impossible. Now that a clear goal is nearly in sight (a computer that operates at the exaFLOPS level), we have begun to pay more attention, and spend more money, on reaching it. For example, the Human Brain Project has the ultimate goal of reaching exascale computing, that is, computing with the same processing power and speed as the human brain: an artificial brain, so to speak. Launched in 2013, the Human Brain Project has already secured billions of euros in funding, and its results could have important ramifications across many industries. The fastest supercomputers built so far have only just passed the 50-petaFLOPS mark, still roughly 20 times slower than the human brain's estimated processing speed, and they are massive machines. Experts believe that exascale computing could be possible by 2020, and Intel, one of the largest technology companies in the world, has boasted that it will achieve that capability by 2018. With a legitimate artificial model of the brain, we would be able to explore real-time simulations of human brain activity, a major breakthrough. Furthermore, interests ranging from engineering and basic research to national security agencies and telecommunication giants are eager to see what this dreamt-of level of technological advancement will bring. However, as explained above, there are serious obstacles to reaching this level of technical sophistication, namely energy, memory, and physical constraints. Even with new advances in graphene transistors and the intriguing possibilities of quantum computing, a purely artificial brain on par with the real thing seems out of reach, for now. The recent stall at the top of the "fastest supercomputer" list has made some people question whether progress will continue, but these new approaches may yet pay off in a major way and launch us into a new generation of computing. If and when that happens, the answer to "who would win, the human brain or a supercomputer?" might be different!
<urn:uuid:0f53c630-375a-4c29-9427-054a318aa395>
CC-MAIN-2019-26
https://www.scienceabc.com/humans/the-human-brain-vs-supercomputers-which-one-wins.html
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998580.10/warc/CC-MAIN-20190617203228-20190617225228-00118.warc.gz
en
0.947477
1,420
3.515625
4
In 1965, Intel co-founder Gordon Moore published a remarkably prescient paper which observed that the number of transistors on an integrated circuit was doubling every two years and predicted that this pace would lead to computers becoming embedded in homes, cars and communication systems. That simple idea, known today as Moore’s Law, has helped power the digital revolution. As computing performance has become exponentially cheaper and more robust, we have been able to do a lot more with it. Even a basic smartphone today is more powerful than the supercomputers of past generations. Yet the law has been fraying for years and experts predict that it will soon reach its limits. However, I spoke to Bernie Meyerson, IBM’s Chief Innovation Officer, and he feels strongly that the end of Moore’s Law doesn’t mean the end of progress. Not by a long shot. What we’ll see though is a shift in emphasis from the microchip to the system as a whole. Going Beyond Silicon The end of Moore’s Law is not a new issue. In fact, Meyerson argues that it first began unraveling in 2003, when insulating components within transistors began failing due to quantum mechanical effects. Since then, chip manufacturers have been finding new materials that are more resistant to decay in their basic atomic properties and progress has continued. However, sometime around 2020, these workarounds will no longer suffice as the silicon itself yields to quantum mechanical reality. Some researchers, including at IBM, are pursuing strategies like carbon nanotubes and silicon photonics that have the potential to increase chip speeds even without having to shrink chips to quantum scale. Other approaches, such as quantum computing and neuromorphic chips, change the nature of computing itself and can be exponentially more efficient for certain tasks, such as pattern recognition in the case of neuromorphic chips and encryption in the case of quantum computers. Still, you wouldn’t want either of these running your word processor. As Meyerson put it, “Quite frankly, for general purpose computing all that stuff isn’t very helpful and we’ll never develop it in time to make an impact beyond specialized applications over the next 5 or 10 years. For the practical future, we need to change our focus from chip performance to how systems perform as a whole by pursuing both hardware and software strategies.” Integrating the Integrated Circuit One way of increasing performance is by decreasing distance at the level of the system. Currently, chips are designed in two dimensions to perform specific functions, such as logic chips, memory chips and networking chips. Although none of them can do much by themselves, acting in concert they allow us to do extremely complex tasks on basic devices. So one approach to increasing performance, called 3D stacking, would simply integrate those integrated circuits into a single three-dimensional chip. This is harder than it sounds, because entirely new chip designs have to be devised, but it would vastly reduce the time circuits need to wait for instructions from each other and increase speed significantly while decreasing power dramatically due to far shorter communication paths. In truth, this is not a new strategy but rather one that was deployed in the 1960’s to overcome a challenge called the tyranny of numbers. Simply put, the physical requirements of wiring thousands of transistors together was putting practical limitations on what could be designed and built. 
That’s what led to the invention of integrated circuits in the first place. Meyerson says, “when we moved from transistors to integrated circuits, we shrunk an entire rack measuring about 40 cubic feet down to a single board measuring 19 x 26 inches. 3D stacking will shrink that board down to less than a square inch and we can potentially get an increase in power performance of at least 10-100 fold. Building Intelligently Agile Systems In the 1980’s, chip manufacturers began building specialized types of chips, called ASICs, that were highly optimized for specific tasks, such as running complex financial models. These would significantly outperform conventional chips for those specific tasks, but ultimately, the process of hardwiring proved too expensive and unwieldy to be a viable strategy. Yet Meyerson sees vastly more potential in a newer approach called FPGA, that can be re-purposed on the fly through software. He points to Intel’s recent purchase of Altera as a strong indication that things are moving in that direction. It is well known that in specific applications FPGA’s can produce gains of ten-fold or more in computing performance, but most importantly, that system level gain is not restricted to a single application. The FPGA approach is a major improvement because rather than going through a roughly 18-month process to design and manufacture a specialized chip, the same thing can be done in a matter of weeks. However, Meyerson thinks the potential may actually be far greater than that if we can build intelligent software that can reprogram the chips autonomically. “So for example,” Meyerson says,” while you’re writing a document, your laptop would be configured to do exactly that, but if you then needed to run a simulation of some financial data for that same report, your system would re-optimize itself for deep computations required. Such “intelligent” architectures and the enabling software are the next grand challenge in IT.” “Take this idea a little further,” he continues “and you can see how new technologies like neuromorphic chips and quantum computing can deliver an enormous impact even as specialized systems in the cloud. Imagine being able to access the capabilities of a neuromorphic system for photo recognition and search while shopping, and then instantly switch to a quantum computer to facilitate the transaction with unbreakable encryption.” The Future of Technology is all too human Back in 1965, when Gordon Moore formulated his famous law, computers were enormous hunks that few people ever saw. After 20 years of continuous doubling, we got personal computers small enough to fit under our desks, but powerful enough to generate a graphical display and interact with us through a keyboard and a mouse. 20 more years gave us the mobile revolution. The future of technology is always more human and Meyerson expects that, ”by 2020, we’ll still be improving system performance exponentially, but we’ll have to change our conception of information technology once again, this time from machines that store, analyze and retrieve information to systems that are active partners in a very natural human/machine collaboration.” “The cognitive era will be ultimate bridge across the digital divide,” he notes, “spanning barriers of not only technology but that of language, education and skill level as well. IT will essentially become so advanced that it disappears along with previous institutional barriers. 
Even a teenager will have access to resources that only the most well-equipped research facilities have today, and they will be able to access them in real time.” But perhaps the most important consequence of Meyerson's vision of cognitive computing is not how it will change the way we work with computers, but the way we work with each other. Before the industrial era, people were valued for their ability to do physical work. In the knowledge economy, those with strong cognitive skills were considered "the best and the brightest." Now we will likely see a new shift in value. In the future, when machines can do cognitive tasks more effectively than any human, competitive advantage will likely go to those who can collaborate effectively, with both people and machines. So the key to the future lies not so much in chips and algorithms as it does within ourselves. Greg Satell is a popular speaker and consultant. His first book, Mapping Innovation: A Playbook for Navigating a Disruptive Age, is coming out in 2017. Follow his blog at Digital Tonto or on Twitter @Digital Tonto.
<urn:uuid:0859c640-ab69-4ac5-874a-ed0a2a387d69>
CC-MAIN-2019-26
https://www.innovationexcellence.com/blog/2017/01/05/moores-law-will-soon-end-but-progress-doesnt-have-to/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560628000575.75/warc/CC-MAIN-20190626214837-20190627000504-00013.warc.gz
en
0.955438
1,689
3.5
4
A tiny cube floating and flipping in midair sounds like something straight out of "Harry Potter," but Harvard physicist Subir Sachdev doesn't need magic to levitate objects. Sachdev performed a levitation demonstration using a magnet and a superconductor during a presentation at the Perimeter Institute on Oct. 1. Superconductors are incredible materials that can conduct electricity with zero resistance. But to generate the superconductivity, the material has to be extremely cold, and so Sachdev poured liquid nitrogen that's about minus 320 degrees Fahrenheit (minus 195 degrees Celsius) on the superconductor to trigger its superconductive state. "One of the key properties of superconductors is that it hates magnetic fields," Sachdev said during his levitation demonstration. And so as the superconductor "repels" the magnet, the magnetic cube is lifted into the air. The magnet will fall after the superconductor begins to warm up again. But superconductors aren't just for levitation demonstrations, Sachdev said. [The Cool Physics of 7 Classic Toys] "The hope is that these materials will actually be useful for something," Sachdev said. Superconducting materials have to be extremely cold to maintain their superconductive state, and physicists are searching for materials that could serve as high-temperature superconductors. High-temperature superconductors could have a wide variety of applications, including in MRI machines, motors, generators, fusion reactors and low-loss power cables. Quantum mechanics 101 Physicists are still not entirely sure what gives a superconductor its magiclike properties and why superconductivity doesn't work above a certain temperature, but Sachdev said he thinks he's pretty close to the answer. But to understand how a superconductor works, "you need to know some quantum mechanics basics," Sachdev said after his levitation demonstration. The main idea of quantum mechanics is that an object like an electron or a photon behaves as both a particle and a wave, Sachdev said. "That's one of the key mysterious properties of quantum mechanics," Sachdev said. The other weird characteristic of quantum particles is that they can exist in multiple places at once, a phenomenon called superposition. But superposition is a fragile state. The moment that scientists try to measure the particles, the superposition state collapses and the particles come to exist in only one spot. Before the particles are disturbed, they exist in multiple places all at once, and "yeah, you just have to accept it," Sachdev joked during his presentation. Quantum entanglement is superposition on a larger scale, something that Sachdev described during his talk. Particles become entangled when they interact with each other. Entanglement means that when an action is performed on one particle, it directly affects that particle's entangled partner no matter how far apart they are. [How Quantum Entanglement Works (Infographic)] Sachdev said a good way to think about this is to imagine how two entangled electrons rotate. Electrons either rotate clockwise (an "up" spin) or counterclockwise (a "down" spin). "Is the left electron up or down?" Sachdev asked the audience. "The answer is really both." And this is true for both electrons. The electrons will stay in this superposition state until someone measures one of the two particles. If one electron has an up spin upon being measured, its entangled partner instantaneously acquires a down spin. 
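The up/down example can be played with directly. The toy simulation below prepares the two-electron entangled state Sachdev describes and samples measurement outcomes: each spin on its own looks random, yet the pair always comes out opposite. A classical sampling script obviously cannot capture the "instantaneous" aspect of entanglement; it only reproduces the measurement statistics.

```python
# Toy simulation of the up/down example: the entangled state (|up,down> - |down,up>)/sqrt(2).
import numpy as np

rng = np.random.default_rng(0)
# Basis order: |up,up>, |up,down>, |down,up>, |down,down>
psi = np.array([0, 1, -1, 0]) / np.sqrt(2)
probs = np.abs(psi) ** 2

labels = ["up,up", "up,down", "down,up", "down,down"]
for k in rng.choice(4, size=10, p=probs):
    print(labels[k])    # only "up,down" or "down,up" ever appears
```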
This is true no matter how far apart the electrons are, even if one electron stayed on Earth and the other was beamed to the moon. Sachdev said he thinks a special kind of this quantum entanglement is responsible for the magiclike properties of superconductors. A crystalline compound called YBCO (yttrium barium copper oxide) is the first material that scientists discovered that can act as superconductor at temperatures above the boiling point of liquid nitrogen (minus 320 degrees Fahrenheit). Sachdev said the copper atoms in this substance are the most important part of the compound. The electrons around the copper atoms pair off, and "every pair of electrons is everywhere [in the material] at the same time," Sachdev said while showing a diagram of the paired electrons. This clump of entangled particles in superposition leads to superconductivity. The quantum entanglement in a superconductor is a little more complex, Sachdev said. It appears the electron pairs swap partners, creating what he calls "long-range entanglement." Learning more about long-range entanglement, Sachdev explained, will lead to better high-temperature superconductors. The basic technology already exists, but other obstacles prevent high-temperature superconductors from being used on a large scale. For example, using superconductors as power lines would require a huge startup cost, Sachdev said. "Just think about replacing all the power cables under New York," Sachdev said. - The Mysterious Physics of 7 Everyday Things - Wacky Physics: The Coolest Little Particles in Nature - The 9 Biggest Unsolved Mysteries in Physics Copyright 2014 LiveScience, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
<urn:uuid:7d8090bf-7657-4ebb-8342-271a6717e44f>
CC-MAIN-2019-26
https://news.yahoo.com/howd-physicist-demos-quantum-levitation-170618553.html
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627999482.38/warc/CC-MAIN-20190624104413-20190624130413-00281.warc.gz
en
0.919333
1,092
3.5
4
Tiny robots could help turn prospective parents' dreams into reality. It depends on how a robotic in-vitro fertilization technique developed by the Advanced Micro and Nanosystems Laboratory (AMNL) at the University of Toronto pans out. Two years ago, this technique was used to "produce the world's first robotically created human fertilization," says Yu Sun, who established the lab in 2004 in the U of T Mechanical and Industrial Engineering department. Large-scale trials have yet to be conducted to determine whether the system is feasible as a standard medical tool. Microtechnology and nanotechology involve the manipulation of extremely small robots or bits of matter. To give a sense of the units of measure involved, a micrometre is one millionth of a metre, while a nanometre is a billionth. The AMNL's in vitro project focused on improving Intracytoplasmic Sperm Injection, a process used to create test tube babies. Developed in the early 1990s, the procedure allows an embryologist to gather a single sperm in a needle and inject it into an oocyte (egg cell). Given that a sperm head is about five micrometres wide, doing this procedure by hand requires a tremendous amount of precision, dexterity and accuracy. To make this process more efficient and precise, the U of T lab developed a robotic injection system. Their system analyzes the sperm "and picks out the best one. The robot then picks up the selected sperm … recognizes the egg cell and punctures the cell membrane to get the sperm in there," Dr. Sun explains. "The injected cell is incubated until it develops in vitro in a petri dish." If it looks like it is growing well, the physician transfers it into the uterus of the patient. What the lab calls the RICSI system (Robotic Intracytoplasmic Sperm Injection) was first used in human trials in 2012. Though the egg cells were successfully fertilized and transferred to the patients, they ended up having miscarriages, Dr. Sun says. Researchers want to improve the technology behind RICSI and attract funding for large-scale patient trials at some point in the next 18 months. Solar cells to synapses The Grutter Research Group at McGill University in Montreal is also active on the tiny technology front. The group, led by Peter Grutter, chairman of McGill's Department of Physics, invents, designs, builds and modifies atomic force microscopes and similar equipment that can be used to manipulate and study matter on a micro or nano scale. These studies have a wide range of applications, such as looking at how organic systems convert light to electricity (useful in solar power cells), or understanding the properties of atomic scale contacts (potentially useful in smaller electronic devices in the future, or in the future production of new materials, such as in advanced car engines), Dr. Grutter says. The group also studies quantum-level processes (with potential applications in quantum computing) and seeks to understand how lithium ions diffuse in batteries (important in getting lithium batteries to charge faster). In another field, the group looks at how connections, or synapses, in neurons are formed and can be artificially manipulated (which could be relevant for treating neuro-degenerative diseases). McGill also runs Nanotools Microfab, a facility where academics and researchers of all stripes can experiment with micro- and nanotechnology (the Toronto Nanofabrication Centre at the U of T serves a similar purpose) and build devices related to their research. 
"Essentially, it's a 21st century tool shop allowing you to machine structures as small as a few nanometres in size," Dr. Grutter says. The University of Waterloo Nanorobotics Group in Waterloo, Ont., is experimenting with levitating tiny robots. They are using a process called "quantum locking," where a superconductor tends to stay in place when exposed to a magnetic field. For example, if you tilt the superconductor at a 45-degree angle while it hovers in the air, it will remain at a 45-degree angle. This allows the group's microrobot, MAYA, to turn or levitate. "We have several projects that could have commercial applications," says Ryan Kearns, project director at the Nanorobotics Group, which is affiliated with the Mechanical and Mechatronics Engineering department. For instance, if the MAYA project is successful, it could have applications in the micro-fabrication industry, where the tiny robots would allow the precise manipulation of microscopic parts, he says. This would also open the door to building more complicated micro-machines. No sci-fi film Still, researchers warn against unrealistic expectations; we're a long way off from Fantastic Voyage, the 1960s science-fiction film in which a miniaturized submarine and crew are injected in a human body to root out a troublesome blood clot. "I am very optimistic about the use of microrobots in medicine. However, they will have specific and limited applications, such as targeted drug delivery, cauterizing small veins, opening blocked arteries, etc. Any 'inject and forget' nano-robot is still many, many years from being possible," Mr. Kearns says.
<urn:uuid:ecc2c65d-a7af-45d1-a175-fb202b8740d5>
CC-MAIN-2019-26
https://www.theglobeandmail.com/news/national/education/how-tiny-robots-could-help-make-babies/article17221573/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998369.29/warc/CC-MAIN-20190617022938-20190617044938-00162.warc.gz
en
0.936021
1,103
3.59375
4
The debris from a dwarf galaxy surrounded in dark matter is now moving through Earth at an insane speed of 500 km/second. Scientists are equating it to a hurricane due to its velocity, yet are saying it will have no impact on our planet. What excites them and why this is such big news is because they believe this will be a prime opportunity to finally observe the hypothetical substance. Back in 2017, astronomers detected a stretched-out line of stars all moving in the same direction along an elliptical path through our region of the Milky Way. Known as the S1 stream, scientists believe this collection of stars to be the remnants of a dwarf galaxy that was shredded by our Milky Way billions of years ago. Through they have detected roughly 30 other streams, this one is of particular interest because its path crosses that of our Sun. What is Dark Matter/Energy? Dark matter/energy is a theory and has never technically been directly observed by anyone. They know more about what it isn’t rather than what it is. It is a term given to the unseen space between solar systems and galaxies that seems to hold everything in place – the unseen glue that makes up an estimated 85% of the universe. It is called dark matter/energy because it does not seem to interact with any observable electromagnetic radiation, such as light – that is to say, any form of electromagnetic radiation that we know of. Whatever it is, it is not on our known spectrum. Our only way of looking at it is to observe its effects. They can see light bend from the gravitational force of invisible objects. They are observing stars moving faster than they should be. They know there is an invisible matter or form of energy contributing to the mass and rotation rate of galaxies and solar systems. Simply put, dark matter is the term given to the invisible energy that exists within space holding it all together. If it were to be seen, it would look like this: If you think about it, dark matter seems to perfectly resemble the neural webbing of a human brain: Related Article: Physicists Find Evidence That The Universe Is A Giant Brain Dark Matter Seems to Be the Mind of the Universe Scientists are saying nothing will happen to the planet when it passes through because they believe dark matter passes through everything: visible matter, itself, every observed particle, etc. Yet, studies have shown that dark matter may, in fact, interact with light particles to form visible glowing halos. Neutrinos and dark matter have been observed displaying weak interactions. Around each planet, star, solar system, and galaxy scientists are now detecting dark matter halos. Dark matter is also believed to exist within the human brain. When neurons of the brain fire up, radioactive potassium isotopes emit neutrinos. If neutrinos have been observed to interact with dark matter in space, then it is possible these neutrinos within our minds would do the same. Just as halos have been observed around solar systems and galaxies, so to have electromagnetic fields of energy known as auras been observed around humans. Anything that is conscious is emitting a detectable electromagnetic field – a halo if you will. Dark matter is an energy we have yet to figure out how to observe because it does not exist within our known electromagnetic spectrum, yet it appears to be fundamental in maintaining the structure of the universe. It is, therefore, possible that dark matter may indeed be the energy of consciousness. What is Actually Hurtling Past Us? 
Since dark matter cannot be observed, how do they know a mass quantity of it is rocketing through our solar system? All that can be seen are the stream of stars moving in the same direction along an elliptical path – another solar system if you will. So in simple terms, by announcing a hurricane of dark matter is moving through our solar system, they are saying a collection of stars or planets is now passing through. Could scientists have announced the arrival of the infamous Planet X and its solar system of planets? Those who have been following this theory believe Planet X to be the architect of worlds: a dwarf star with seven planets in tow. The Sumerian tablets describe this planet as being the home of a race known as the Annunaki who genetically recreated humans long ago. The Nag Hammadi scriptures describe them as Archons – cyborgs that are being led by an ancient form of artificial intelligence. If dark energy is indeed the energy of consciousness, then perhaps what is passing through our planet is the energy or consciousness of that solar system; our ancient history – and it is absolutely affecting our planet. This could explain why we are seeing: - The tilt of our Sun being off by 6 degrees as if something is pulling on it - Not only is Earth heating up, but so are all of the other planets within our solar system. The brightness of other planets is increasing - Earth’s magnetic field is beginning to weaken - Earth’s frequency of 7.83 Hz, known as Schumann’s Resonance is suddenly spiking between 12 – 16 Hz - Major world changes are now taking place, such as Earth’s pole shifting, increase in earthquakes and floods - An uptick in asteroids and meteors entering our atmosphere - Increase in UFO appearances, such as the recent Ireland sightings - Telescopes shutting down, such as Hubble and Chandra Most importantly, a mass awakening is occurring. The illusions pulled over our eyes from the corrupt institutions that rule this planet are dissipating and we are beginning to see the world as it truly is. Related Article: 5 Signs We Are Going Through A Global Mass Awakening Now here is where things get spooky. The leading dark matter/energy detector is CERN. CERN and Google have just partnered up: “Together, we will be working to explore possibilities for joint research and development projects in cloud computing, machine learning, and quantum (artificial intelligence) computing.” For those who are unfamiliar with CERN, the large hadron particle collider located in Geneva, Switzerland, known for smashing particles together, as well as creating and storing anti-matter, here is what you might not know: - A large portion of CERN is located in the territory of Saint Genis Pouilly. In Roman times, it was called Apolliacum. The town and temple were dedicated to Apollyon – the destroyer (Shiva/Horus). A 2m tall statue of Shiva sits near the main building. - A video of a human sacrifice taking place in front of the giant Shiva statue surfaced in 2016. - The main dipoles generate magnetic fields that are 100,000 more powerful than the Earth’s magnetic field. - Otto Rossler, a German professor at the University of Tubingen, filed a lawsuit against CERN with the European Court of Human Rights, on the grounds that the facility could trigger a mini black hole that could get out of control and annihilate the planet. The Court tossed out Rossler’s request. 
- Sergio Bertolucci, former Director for Research and Scientific Computing of the facility, grabbed headlines when he told a British tabloid the supercollider could open otherworldly doors to another dimension for “a very tiny lapse of time,” mere fractions of a second. However, that may be just enough time “to peer into this open door, either by getting something out of it or sending something into it.” This is what happens above CERN when they are busy colliding particles: Now the leading company in artificial intelligence and quantum computing is partnering up with a facility that either has or is attempting to access other dimensions – just as they announce the arrival of a dark matter hurricane/solar system of stars/planets that ancient texts describe as cyborgs under the control of artificial intelligence. Have Google and CERN partnered to welcome the arrival of the Old Gods? Get involved in a truly independent media platform. Freedom.social is designed for truth seekers and activists of all types, where you get paid in 1776 tokens to participate and take action. Referral link: https://freedom.social/JustinD-register
<urn:uuid:486ecebd-d3ab-42e1-ac62-c2faf7bfe9f6>
CC-MAIN-2019-26
https://perc360.com/a-dark-matter-hurricane-is-now-passing-through-earth-what-are-they-not-telling-us-360/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627999066.12/warc/CC-MAIN-20190619224436-20190620010436-00121.warc.gz
en
0.947728
1,685
3.546875
4
Devin Powell, Science News, via Tech News, Discovery News (March 28, 2011) " * By manipulating atoms inside diamonds, scientists have developed a new way to store information. " * The technique could lead to quantum computers capable of solving problems beyond the reach of today's technology. "Could be that diamonds are a geek's best friend. "Scientists have developed a new way to manipulate atoms inside diamond crystals so that they store information long enough to function as quantum memory, which encodes information not as the 0s and 1s crunched by conventional computers but in states that are both 0 and 1 at the same time. Physicists use such quantum data to send information securely, and hope to eventually build quantum computers capable of solving problems beyond the reach of today's technology. "For those developing this quantum memory, the perfect diamonds don't come from Tiffany & Co. -- or Harry Winston, for that matter. Impurities are the key to the technology. " 'Oddly enough, perfection may not be the way to go,' said David Awschalom of the University of California, Santa Barbara. 'We want to build in defects.'... The article implies that the defects - anomalies in the diamond crystal's lattice - would probably involve nitrogen, a frequently-found impurity in diamonds. The non-carbon atoms are important because - "...Several years ago, scientists learned how to change the spin of such electrons using microwave energy and put them to work as quantum bits, or qubits...." The new technique links the spin of an electron to a nitrogen atom's nucleus. The transfer involves magnetic fields, and it's fast: "...about 100 nanoseconds, comparable to how long it takes to store information on a stick of RAM." Back to the article, again: "...The technique has 'a fidelity of 85 to 95 percent,' Awschalom said March 22 in Dallas at a meeting for the American Physical Society. "In contrast to some other quantum systems under development, which require temperatures close to absolute zero, this diamond memory works at room temperature. The spins inside the diamond can be both changed and measured by shining laser light into the diamond. This could make diamond an attractive material for scientists developing nanophotonic systems designed to move and store information in packets of light. "Unlike a diamond itself, this quantum memory isn't forever. But it lasts for a very long time by quantum standards. The nuclear spin remains coherent for more than a millisecond, with the potential to improve to seconds.... "...Sebastian Loth, a physicist at IBM's Almaden Research Center in San Jose, Calif. [said], 'If you have a lifetime of milliseconds, that lets you do millions of operations.' "In addition to stability, diamond may also overcome another hurdle that has faced quantum computing -- it can be scaled up to larger sizes. In a paper published last year in Nano Letters, Awschalom developed a technique for creating customizable patterns of nitrogen atoms inside a diamond, using lasers to implant thousands of atoms in a grid...." A thousand atoms in a grid is impressive: but the scaling doesn't, apparently, stop there. Transmitting quantum information is possible, by connecting/entangling qubits. Problem is, entanglement seems to work up to a distance of kilometers: which is huge on the atomic scale, but pretty much useless for a network that extends much beyond one city. That may not be such a serious limitation, though. 
The article ends with this: "...Quantum repeaters could potentially use small chips of diamond to catch, store and retransmit this information to extend the range, enabling quantum networks to work over much longer distances." The principle sounds pretty much like the way we transmit radio and television signals today - and that's almost another topic. For an article with phrases like "fidelity of 85 to 95 percent" that says why some folks are interested in which way electrons spin - it's pretty interesting. In the Lemming's opinion. Your experience may vary. The Still Sell Vacuum TubesThe Lemming checked: and sure enough, some outfits are still selling vacuum tubes. That's not very surprising. Old technologies are sometimes useful for particular situations - or someone may just like doing things the old-fashioned way. One of the Lemming's extended family was a flint knapper - and that's another topic. It's been an exciting half-century. The Lemming remembers when it was obvious that computers would be huge things, occupying entire buildings and consuming vast amounts of power. Then the transistor stopped being a laboratory curiosity, and started being part of little boxes attached to the ears of adolescents. This hasn't, the Lemming suspects, been a particularly comfortable era for folks who'd just as soon that their great-grandfather's way of life be indistinguishable from their own - and for whom "innovation" is the reckless practice of trying a new sort of food. Or wearing a shirt of a different color. And the Lemming's getting off-topic again. - "Good News, Neural Devices Connect Brain, Computers: Bad News, Same Thing" (July 11, 2009) - "Nanotechnolgy and Electronics: Atom-Sized Transistors ('Nanotronics'??)" (February 20, 2009) - "Programmable Metallization Cell (PMC): One Terabyte of Data in a Little Package" (June 20, 2008) - "More About the Marvelous Memristor" (May 1, 2008) - "Oral Tradition; Writing; Movable Type; Internet - Exciting Times!" (July 31, 2007)
<urn:uuid:b03a3761-8016-433f-90de-fe48b93e1fab>
CC-MAIN-2019-26
https://apatheticlemming.blogspot.com/2011/03/quantum-entanglement-diamonds-and-new.html
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998600.48/warc/CC-MAIN-20190618003227-20190618025227-00007.warc.gz
en
0.925432
1,182
3.5
4
If you are flipping coins, and you want to turn up five heads at the same time, how would you go about it? You could take five coins and keep flipping them until the odds finally work in your favour. Or you could flip a lot of coins at once, and only count the ones that turn up heads. That second idea is the basic approach to scattershot boson sampling, considered a potential precursor to quantum computing, and Perimeter Institute postdoctoral researcher Daniel Brod is part of a team that just showed it can work, in a paper published in the new web-based journal Science Advances on Friday. A boson sampler is essentially a simplified quantum computation device that uses bosons (in this case, photons) to carry out a specific task. No one yet knows if boson sampling has any practical application, but it is considered a good test case for quantum computation because photon behaviour inside the sampler is expected to be hard to simulate classically. In boson sampling, photons are sent into an interferometer made up of an array of beam splitters. Interferometers can be as big as a room, but boson sampling experiments often use chips as small as microscope slides, with a network of optical fibres etched into the glass. The photons follow along the fibres and, when two fibres come sufficiently close, there is a chance that the photon will “jump” from one to the other. These close-set fibres effectively acts as a beam splitter. Each time a photon passes through a beam splitter, it moves along two directions in quantum superposition. A measurement at the output ports reveals the balance between constructive and destructive interference that the photon experienced along the way. There is a significant challenge to boson sampling, though: we don’t yet have a simple way to generate identical photons on demand. Experiments rely on a technique called parametric down-conversion (PDC). PDC sources shine a laser through a non-linear crystal, and some of the laser’s photons are converted into new pairs of photons that split off in opposite directions. For boson sampling, one photon from the new pair shoots into the sampling device; the other flows into a separate collector that alerts the scientists to the photon’s existence (this is called “heralding”). The problem is, you cannot control when these new, paired photons will appear. The result is somewhat similar to the coin-flipping example from above: three PDC sources will eventually generate three identical photons, but you can’t control when that will happen. If you want to create 30 photons at once (enough to run an experiment that is too hard for a classical computer to simulate), you could be waiting such a long time – weeks, months, or longer – that it renders the experiment void. This limitation was a major obstacle for boson sampling. Then, a new idea was floated in 2013: why not take a scattershot approach? By connecting several PDC sources to the interferometer, but only collecting data when they produced the photons you needed, scientists could, in essence, flip more coins. The idea of taking a scattershot approach was explored on the blog of MIT professor Scott Aaronson (who proposed the original boson sampling mode in 2010), with the idea credited to Steven Kolthammer in Oxford, and came on the heels of a similar idea proposed by researchers at the University of Queensland. 
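A quick Monte Carlo estimate shows why "flipping more coins" helps. In the sketch below each PDC source is assumed to herald a photon with a small probability p per pulse; the value of p is made up, and the real experiment's source layout differs slightly from this simplified picture. Requiring any three of six sources to fire succeeds far more often than requiring three fixed sources to all fire at once.

```python
# Monte Carlo sketch of the scattershot idea with an assumed per-pulse heralding probability.
import random

random.seed(42)
p, pulses = 0.1, 200_000

fixed = scatter = 0
for _ in range(pulses):
    fires = [random.random() < p for _ in range(6)]
    if all(fires[:3]):
        fixed += 1          # three dedicated sources all fired
    if sum(fires) >= 3:
        scatter += 1        # any three of the six sources fired

print(fixed / pulses, scatter / pulses, scatter / max(fixed, 1))  # scattershot wins by a large factor
```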
Using this scattershot approach, a collaboration including Brod and theorists and experimentalists from Perimeter, Brazil, Rome, and Milan has made another significant advance. In experiments led by Fabio Sciarrino at the Quantum Optics group at the Sapienza University of Rome, the team performed boson sampling in a chip with 13 input ports and 13 output ports, connected by pathways in the chip. The experiment called for three photons, so the team connected six PDC sources to the sampler’s input ports. Two photons would come from the first PDC source. The third photon could come from any of the other five sources. (Watch an animation of the experiment here.) Whenever three photons were generated at the input ports, the team collected the corresponding output data showing from where the photons emerged. (Since the photons are identical, it is impossible to know which incoming photon ends up where.) This is what makes the computation fundamentally hard, Brod says: “If we could know which path a photon followed, or if we could distinguish them (by their frequency or polarization, for example), the whole thing would be easy to simulate classically.” While this did not increase the number of photons being used in a boson sampling experiment, the scattershot approach collected data 4.5 times faster. “The first time we did experiments with three photons, I think it took over a week to collect the data. That is very slow,” Brod says. “An almost fivefold improvement is good, but this approach promises exponentially larger improvements as the experiments scale up.” The improvement also addresses one of the big problems facing boson sampling: the reliance on PDC photon sources. The team showed that, even if you can’t create photons on demand, with enough sources working in a scattershot manner, you can get the desired number of photons to run your experiment. There are still many other challenges ahead, including ways of finding out if the device is working like it’s supposed to. After all, if no classical computer can reproduce the results, “how do you know that your data really corresponds to anything interesting?” Brod says. “This is one of the most important open questions from the theoretical side. We are going to need more theoretical advances before we can say something concrete about the computational tasks these systems are performing.” Back in 2011, when boson sampling came to the fore, Brod was a curious PhD student in Brazil who became engrossed in a lecture Scott Aaronson gave at Perimeter that was webcast on PIRSA. Now, Brod is a member of one of four experimentalist teams around the world actively pushing the idea forward. There’s also a lot of theoretical work yet to do, but that is fine by him. “I find this whole boson sampling idea very elegant – the way that it seems, at least, to connect some concepts of computer science to fundamental physics, the very fundamental properties of identical particles. I think that is a very elegant connection that I would definitely like to understand better.” – Tenille Bonoguore Bosons are fundamental particles that can occupy the same quantum state, and can be elementary, like photons, or composite, like mesons. Superposition is the counter-intuitive quantum phenomenon where a particle can display the wave-like behaviour of being “spread out” through various points in space, whilst retaining the particle-like property that it can only be measured at specific locations. 
In this case, it’s as if the photon, after passing through a beam splitter, propagates along the two outward directions at once. The caveat is that, when it reaches a detector, it is found in only one of those two directions with corresponding probabilities. This is the so-called collapse of the wave function.
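For readers who want a sense of why this photon behaviour is believed to be hard to simulate classically, here is a small illustrative calculation (not taken from the paper): the probability of one particular three-photon detection pattern is the squared modulus of the permanent of a 3x3 submatrix of the interferometer's unitary. The 13-mode unitary below is a random stand-in for the chip, and the input and output ports are chosen arbitrarily.

```python
import itertools
import numpy as np

def permanent(M):
    """Brute-force permanent; fine for 3x3, but the cost explodes with size,
    which is the heart of the classical-hardness argument."""
    n = M.shape[0]
    return sum(np.prod([M[i, s[i]] for i in range(n)])
               for s in itertools.permutations(range(n)))

def random_unitary(m, seed=0):
    """Haar-like random unitary standing in for the chip's actual transformation."""
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(m, m)) + 1j * rng.normal(size=(m, m))
    Q, R = np.linalg.qr(A)
    return Q * (np.diag(R) / np.abs(np.diag(R)))

m = 13
inputs, outputs = (0, 1, 2), (4, 7, 11)    # arbitrary example ports
U = random_unitary(m)
A = U[np.ix_(outputs, inputs)]             # 3x3 submatrix linking inputs to outputs

# For single photons and a collision-free outcome, P = |Perm(A)|^2
print("probability of this particular output pattern:", abs(permanent(A)) ** 2)
```

For three photons the permanent is trivial to evaluate, but the best known classical algorithms scale roughly as 2^n in the photon number n, which is why experiments aim to push the photon number up.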
Quantum Key Distribution

Quantum key distribution is a technique used in the context of quantum cryptography in order to generate a perfectly random key which is shared by a sender and a recipient while making sure that nobody else has a chance to learn about the key, e.g. by intercepting the communication channel used during the process. Basic principles of quantum mechanics are exploited to ensure that. Only if quantum mechanics were to turn out to be a flawed theory (for which there is no reasonable evidence after decades of intense research) might it be possible to break the security of such a communication system.

The best known and most popular scheme of quantum key distribution is based on the Bennett–Brassard protocol (in short: BB84), which was invented in 1984 [1]. It relies on the no-cloning theorem [3, 4] for non-orthogonal quantum states. For example, it can be implemented using polarization states of single photons. Briefly, the Bennett–Brassard protocol works as follows:
- The sender (usually called Alice) sends out a sequence of single photons. For each photon, she randomly chooses one of two possible base states, with one of them having the possible polarization directions up/down and left/right, and the other one polarization directions which are tilted by 45°. In each case, the actual polarization direction is also randomly chosen.
- The receiver (called Bob) detects the polarizations of the incoming photons, also randomly choosing the base states. This means that on average half of the photons will be measured with the “wrong” base states, i.e. with states not corresponding to those of the sender.
- Later, Alice and Bob use a public (possibly interceptable) communication channel to talk about the states used for each photon (but not about the chosen polarization directions). In this way, they can find out which of the photons were by chance treated with the same base states on both sides.
- They then discard all photons with a “wrong” basis, and the others represent a sequence of bits which should be identical for Alice and Bob and should be known only to them, provided that the transmission has not been manipulated by anybody. Whether or not this has happened, they can test by comparing some number of the obtained bits via the public information channel. If these bits agree, they know that the other ones are also correct and can finally be used for the actual data transmission.

A possible eavesdropper (called Eve) would have to detect the photons' polarization directions without knowing the corresponding base states. In those cases where Eve's guess concerning the base states is wrong, Eve obtains random results. If Eve sends out photons with these polarization directions, Bob's results will also be random in cases where Bob's guess was right. This will therefore be detected during the last stage (the bit verification). Quantum mechanics would not allow Eve to do a polarization measurement without projecting the photon state onto the chosen base states, i.e., without altering the photon states.

Note that Alice and Bob actually need to carry out secure authentication in order to prevent an interceptor from manipulating their public communications. This also requires some secret key, which at first glance would seem to lead to a catch-22 situation: you need a secret key in order to generate another secret key. However, authentication requires only a short key, whereas the quantum key distribution scheme can generate a much longer one and is therefore still useful.
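As a concrete illustration of the steps above, here is a toy, noise-free simulation of the BB84 transmission and sifting stages. Basis 0 stands for the up/down and left/right polarizations and basis 1 for the 45°-tilted pair; the structure and names are ours, purely for illustration.

```python
import random

random.seed(42)
n = 32  # photons sent

alice_bases = [random.randint(0, 1) for _ in range(n)]
alice_bits  = [random.randint(0, 1) for _ in range(n)]   # polarization within each basis

bob_bases = [random.randint(0, 1) for _ in range(n)]
# If Bob guesses the basis correctly he reads Alice's bit; otherwise the photon
# is projected onto the wrong basis and his result is random.
bob_bits = [a_bit if a_basis == b_basis else random.randint(0, 1)
            for a_bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)]

# Public discussion: keep only the positions where the bases happened to match.
sifted    = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
key_alice = [alice_bits[i] for i in sifted]
key_bob   = [bob_bits[i]   for i in sifted]

print(f"sent {n} photons, sifted key length {len(sifted)} (about half)")
print("keys agree:", key_alice == key_bob)   # True in this noiseless toy model
```

In a real system Alice and Bob would additionally compare a random subset of the sifted bits to estimate the error rate and detect a possible eavesdropper, as described above.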
Some remaining problems are:
- Ideally, a perfect single-photon source should be used by the sender, but this is difficult to realize. Using strongly attenuated laser pulses, which contain only on the order of one photon per pulse, creates some risk that pulses which by chance contain more than one photon can be used by Eve to gain some information (see the sketch after this list). However, there are schemes of privacy amplification to destroy this possible knowledge of Eve, at the cost of reducing the number of obtained bits for the key.
- Losses in the transmission channel (e.g. an optical fiber) reduce the degree of the required quantum correlations and also create chances for an eavesdropper. However, there are also refinements of the technique (quantum error correction) to deal with this issue, provided that the losses are low enough (at most a few percent of the photons).
- The bit rate with which a key is generated is normally fairly low, particularly for large transmission distances. This accordingly limits the bit rate of secure communications.

A modified cryptography scheme was suggested in 1991 by Ekert [2]. Here, entangled states are used instead of the randomly chosen measurement basis. In many respects, this protocol is similar to the BB84 protocol.

Some quantum key distribution systems have been demonstrated which promise unconditional security for transmission distances up to a few tens of kilometers, although at least one system has been proven not to be perfectly secure; successful eavesdropping has been demonstrated [10]. It should be possible, however, to eliminate such security loopholes with more careful implementations. Further system refinements should also allow for transmission distances over 100 km. Research is also directed at developing more practical single-photon and correlated photon-pair sources, based on, e.g., spontaneous parametric downconversion in χ(2) crystals or spontaneous four-wave mixing in optical fibers. There are already some commercial quantum key distribution systems which can be used by banks, for example.
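The sketch referred to in the first item: an attenuated laser pulse has Poissonian photon-number statistics, so the fraction of pulses carrying two or more photons can be estimated directly. The mean photon numbers below are typical textbook values, not figures from this article.

```python
from math import exp, factorial

def poisson(n, mu):
    """Probability of finding n photons in a pulse with mean photon number mu."""
    return mu ** n * exp(-mu) / factorial(n)

for mu in (1.0, 0.5, 0.1):
    p_empty  = poisson(0, mu)
    p_single = poisson(1, mu)
    p_multi  = 1 - p_empty - p_single
    print(f"mu = {mu:.1f}: empty {p_empty:.3f}, single photon {p_single:.3f}, "
          f"multi-photon {p_multi:.3f}")
```

Lowering the mean photon number suppresses the risky multi-photon pulses, but then most pulses are empty, which is one reason the key rates mentioned in the last item are low.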
Bibliography
[1] C. H. Bennett and G. Brassard, “Quantum Cryptography: Public key distribution and coin tossing”, in Proceedings of the IEEE International Conference on Computers, Systems, and Signal Processing, Bangalore, p. 175 (1984) (Bennett–Brassard protocol)
[2] A. Ekert, “Quantum cryptography based on Bell's theorem”, Phys. Rev. Lett. 67 (6), 661 (1991)
[3] W. K. Wootters and W. H. Zurek, “A single quantum cannot be cloned”, Nature 299, 802 (1982) (no-cloning theorem)
[4] N. J. Cerf and J. Fiurasek, “Optical quantum cloning – a review”, Prog. Opt. 49, 455 (2006)
[5] A. Tanaka et al., “Ultra fast quantum key distribution over a 97 km installed telecom fiber with wavelength division multiplexing clock synchronization”, Opt. Express 16 (15), 11354 (2008)
[6] C. Erven et al., “Entangled quantum key distribution over two free-space optical links”, Opt. Express 16 (21), 16840 (2008)
[7] A. R. Dixon et al., “Gigahertz decoy quantum key distribution with 1 Mbit/s secure key rate”, Opt. Express 16 (23), 18790 (2008)
[8] C. Bonato et al., “Feasibility of satellite quantum key distribution”, New J. Phys. 11, 045017 (2009)
[9] D. Stucki et al., “High rate, long-distance quantum key distribution over 250 km of ultra low loss fibres”, New J. Phys. 11, 075003 (2009)
[10] I. Gerhardt et al., “Full-field implementation of a perfect eavesdropper on a quantum cryptography system”, Nature Commun. 2, 349 (2011), DOI: 10.1038/ncomms1348
[11] H.-K. Lo, M. Curty and K. Tamaki, “Secure quantum key distribution” (review paper), Nature Photon. 8, 595 (2014)
[12] Q. Zhang et al., “Large scale quantum key distribution: challenges and solutions”, Opt. Express 26 (18), 24260 (2018)
Imagine if engineers could build a computer to be millions of times faster than anything that exists today, yet so small it's microscopic. John Preskill, a theoretical physicist at the California Institute of Technology, explains the science behind quantum computing, the next great frontier in computer science. "Science Behind the News" is produced in partnership with the National Science Foundation.

Science Behind the News – Quantum Computing

ANNE THOMPSON reporting: Whether at home, the office, or in the palms of our hands, computer technology is getting smaller, faster, and more inseparable from our everyday lives. But imagine if engineers could build a computer to be millions of times faster than anything that exists today, yet so small it's microscopic. In October 2012, the Nobel Prize in Physics was awarded to Serge Haroche and David Wineland for their research on a new type of computer that may revolutionize the way information is processed: the quantum computer.

Prof. JOHN PRESKILL (California Institute of Technology): It's really a qualitatively different way of encoding, using, processing information than the way we do it in the computers we have today.

THOMPSON: Dr. John Preskill is an NSF-funded theoretical physicist at the California Institute of Technology, who works in the field of quantum computing. A quantum computer is made up of two or more atoms or electrons, called quantum bits, or "qubits." These qubits, like all atomic particles, operate according to the laws of quantum mechanics.

PRESKILL: The word quantum refers to the laws of physics that describe microscopic objects, the laws of physics that hold sway at the scale of individual atoms, single electrons.

THOMPSON: While quantum computers sound complex, in reality, the way qubits represent information is the same as in traditional computers: by using binary digits, or bits, designated as 0's or 1's. Scientists can control how these qubits exchange information from one to another by using the laws of physics to manipulate their state, spin, or vibration. The first method involves isolating two individual atoms and altering their energy state.

PRESKILL: We can shine lasers on the atoms and in a controlled way change the state of an atom from, say, the ground state to some combination of the ground state and the excited state.

THOMPSON: Normally an atom's electrons occupy the "ground state", which is the lowest level of energy an electron can occupy. Its configuration is represented on the Periodic Table of the Elements. If an atom's electrons do not match the ground state, then it's considered to be in the "excited state." By manipulating the state of an atom's electrons, scientists can make them represent either the 0 or 1 bit.

PRESKILL: And we could store a bit, like we do in digital computers today, by preparing each atom in either its ground state or an excited state.

THOMPSON: The second method for building a quantum computer involves controlling the spin of two isolated electrons. This spin can either be up or down, allowing them to also represent either the 0 or 1 bit.

PRESKILL: Electrons are like little magnets. And so the electron has a north pole and a south pole. And so we could store just an ordinary bit by saying that the electron's spin, its magnet, is oriented either up or down.

THOMPSON: David Wineland received the Nobel Prize for devising a third type of quantum computer, by isolating charged atoms, or ions, in an ion trap.
PRESKILL: The trap is like a bowl, and the ion sits at the bottom of the bowl, and it can rock back and forth around the bottom. And we can excite those vibrational modes, depending on whether the atom is in its ground state or its excited state. And that allows the two atoms to talk to one another.

THOMPSON: Though today's quantum computers contain only a few qubits, scientists hope they will reach the scale of thousands or even millions of qubits and be able to perform calculations too large and complex for today's traditional computers. Such breakthroughs could spark incredible advances in cybersecurity, medicine, science, and countless other fields.

PRESKILL: Probably the most important applications are ones that we just haven't thought of yet. Because quantum computing is a very new idea.

THOMPSON: Quantum computing, a new idea that could pave the way for big changes, by operating in very small ways.

Mathematics is integral to computers. Most computer processes and functions rely on mathematical principles. The word "computers" is derived from computing, meaning the process of solving a problem mathematically. Large complex calculations (or computing) in engineering and scientific research often require basic calculators and computers.
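As a small illustration of the encoding described in the transcript (not part of the broadcast material), a qubit's state can be written as a two-component vector over the ground and excited levels, and a laser pulse acts like a rotation of that vector. The angles and names below are illustrative assumptions.

```python
import numpy as np

ground  = np.array([1.0, 0.0])   # encodes the bit value 0
excited = np.array([0.0, 1.0])   # encodes the bit value 1

def pulse(theta):
    """A rotation standing in for a laser pulse of a chosen strength and duration."""
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]])

flipped    = pulse(np.pi) @ ground      # a "pi pulse" turns 0 into 1
superposed = pulse(np.pi / 2) @ ground  # a "pi/2 pulse" gives a 50/50 combination

for name, state in [("pi pulse", flipped), ("pi/2 pulse", superposed)]:
    p_one = (excited @ state) ** 2      # probability of reading out a 1
    print(f"{name}: P(measure 1) = {p_one:.2f}")
```

The 50/50 case is the combination of ground and excited states Preskill describes; a measurement still returns a definite 0 or 1, with probabilities set by the state.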
Since there are four possibilities, her choice of operation represents two bits of classical information. Note that transforming just one bit of an entangled pair means performing the identity transformation on the other bit. Alice then sends her qubit to Bob, who must deduce which Bell basis state the qubits are in. Bob first applies a controlled-NOT to the two qubits of the entangled pair. Bob then measures the second qubit. If the measurement returns |0〉, the encoded value was either 0 or 3; otherwise the value was either 1 or 2. Bob now applies H to the first qubit and measures that qubit. This allows him to distinguish between 0 and 3, and between 1 and 2.

In principle, dense coding can permit secure communication: the qubit sent by Alice will only yield the two classical information bits to someone in possession of the entangled partner qubit. But more importantly, it shows why quantum entanglement is an information resource. It reveals a relationship between classical information, qubits, and the information content of quantum entanglement.

Teleportation is the ability to transmit the quantum state of a given particle using classical bits and to reconstruct that exact quantum state at the receiver. The no-cloning principle, however, requires that the quantum state of the given particle be necessarily destroyed. Instinctively, one perhaps realizes that teleportation may be realizable by manipulating a pair of entangled particles; if we could impose a specific quantum state on one member of an entangled pair of particles, then we would be instantly imposing a predetermined quantum state on the other member of the entangled pair. The teleportation algorithm is due to Charles Bennett and his team (1993) [34]. Teleportation of a laser beam consisting of millions of photons was achieved in 1998. In June 2002, an Australian team reported a more robust method of teleporting a laser beam. Teleportation of trapped ions was reported in 2004 [35]. In May 2010, a Chinese research group reported that they were able to “teleport” information 16 kilometers [36]. In July 2017, a Chinese team reported “the first quantum teleportation of independent single-photon qubits from a ground observatory to a low Earth orbit satellite – through an up-link channel – with a distance up to 1400 km” [37]. This experiment is an important step forward in establishing a global-scale quantum internet in the future. In theory, there is no distance limit over which teleportation can be done. But since entanglement is a fragile thing, there are technological hurdles to be overcome.

To see how teleportation works, let Alice possess a qubit of unknown state |φ〉 = a|0〉 + b|1〉. She wishes to send the state of this qubit to Bob through classical channels. In addition, Alice and Bob each possess one qubit of an entangled pair in the state

(1/√2)(|00〉 + |11〉).

Alice applies the decoding step of dense coding to the qubit |φ〉 to be transmitted and her half of the entangled pair. The initial state is

(1/√2)(a|000〉 + a|011〉 + b|100〉 + b|111〉),

of which Alice controls the first two qubits and Bob controls the last qubit. She now applies Cnot ⊗ I and then H ⊗ I ⊗ I to this state, giving

(1/2)[ |00〉(a|0〉 + b|1〉) + |01〉(a|1〉 + b|0〉) + |10〉(a|0〉 - b|1〉) + |11〉(a|1〉 - b|0〉) ].

Alice now measures the first two qubits (equivalent to a Bell-basis measurement on her two qubits before the Cnot and H) to get one of |00〉, |01〉, |10〉, or |11〉 with equal probability. That is, Alice's measurements collapse the state onto one of four different possibilities, and yield two classical bits. Alice sends the result of her measurement as two classical bits to Bob.
Depending on the result of the measurement, the quantum state of Bob's qubit is projected to a|0〉 + b|1〉, a|1〉 + b|0〉, a|0〉 - b|1〉, or a|1〉 - b|0〉, respectively [38]. Note that Alice's measurement has irretrievably altered the state of her original qubit |φ〉, whose state she is trying to send to Bob. Thus, the no-cloning principle is not violated. Also, Bob's particle has been put into a definite state. When Bob receives the two classical bits from Alice he knows how the state of his half of the entangled pair compares to the original state of Alice's qubit. Bob can reconstruct the original state of Alice's qubit |φ〉 by applying the appropriate decoder to his part of the entangled pair. Note that this is the encoding step of dense coding.

The interesting facts to note are as follows. First, the state that is transmitted is completely arbitrary (not chosen by Alice and unknown to her). Second, a message with only binary classical information, such as the result of the combined experiment made by Alice, is definitely not sufficient information to reconstruct a quantum state; in fact, a quantum state depends on continuous parameters, while results of experiments correspond to discrete information only. Somehow, in the teleportation process, binary information has turned into continuous information! The latter, in classical information theory, would correspond to an infinite number of bits. It also happens that Alice cannot determine the state of her particle with state |φ〉 by making a measurement and communicating the result to Bob, because it is impossible to determine the unknown quantum state of a single particle (even if one accepts only an a posteriori determination of a perturbed state); one quantum measurement clearly does not provide sufficient information to reconstruct the whole state, and several measurements will not provide more information, since the first measurement has already collapsed the state of the particle. Note also that without the classical communication step, teleportation does not convey any information at all. The original qubit ends up in one of the computational basis states |0〉 or |1〉 depending on the measurement outcome.

Quantum teleportation can be used to move quantum states around, e.g., to shunt information around inside a quantum computer or indeed between quantum computers. Quantum information can be transferred with perfect fidelity, but in the process the original must be destroyed. This might be especially useful if some qubit needs to be kept secret. Using quantum teleportation, a qubit could be passed around without ever being transmitted over an insecure channel. In addition, teleportation inside a quantum computer can be used as a security feature wherein only one version of sensitive data is ensured to exist at any one time in the machine. We need not worry about the original message being stolen after it has been teleported because it no longer exists at the source location. Furthermore, any eavesdropper would have to steal both the entangled particle and the classical bits in order to have any chance of capturing the information.
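To make the algebra above concrete, here is a minimal NumPy state-vector sketch of the protocol just described: Alice applies the controlled-NOT and H, measures her two qubits, and Bob applies the correction selected by her two classical bits. The gate matrices, variable names and the correction table are our own illustrative choices, not code from the article or from the IBM Q material cited below.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X  = np.array([[0, 1], [1, 0]], dtype=complex)
Z  = np.array([[1, 0], [0, -1]], dtype=complex)
H  = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                 [0, 0, 0, 1], [0, 0, 1, 0]], dtype=complex)  # qubit 0 controls qubit 1

def kron(*ops):
    out = np.array([[1]], dtype=complex)
    for op in ops:
        out = np.kron(out, op)
    return out

rng = np.random.default_rng(1)

# Alice's unknown state |phi> = a|0> + b|1>
a, b = rng.normal(size=2) + 1j * rng.normal(size=2)
a, b = np.array([a, b]) / np.linalg.norm([a, b])
phi = np.array([a, b])

# Shared pair (|00> + |11>)/sqrt(2); the 3-qubit register is phi tensor pair
pair = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
state = np.kron(phi, pair)

# Alice: CNOT on qubits 0,1 then H on qubit 0
state = kron(CNOT, I2) @ state
state = kron(H, I2, I2) @ state

# Alice measures qubits 0 and 1; outcome m encodes her two classical bits
probs = [np.sum(np.abs(state[2 * m: 2 * m + 2]) ** 2) for m in range(4)]
m = int(rng.choice(4, p=probs))
bob = state[2 * m: 2 * m + 2]
bob = bob / np.linalg.norm(bob)

# Bob's correction, keyed by the two bits: 00 -> I, 01 -> X, 10 -> Z, 11 -> Z after X
bob = {0: I2, 1: X, 2: Z, 3: Z @ X}[m] @ bob

print(f"outcome {m:02b}, overlap |<phi|bob>| = {abs(np.vdot(phi, bob)):.6f}")  # ~1.0
```

Running this repeatedly gives all four outcomes with roughly equal probability, and in every case Bob's corrected qubit matches the original state up to a global phase, exactly as the four projected states listed above require.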
11. Conclusion of Part II

Thus far we have, by a variety of examples, shown the power of quantum computing. In Section 10 we concluded Part II by describing some of the prized algorithms. Readers desirous of getting some hands-on experience with a few of the algorithms presented in Part II can visit https://acc.digital/Quantum_Algorithms/Quantum_Algorithms_June2018.zip . This material is provided by Vikram Menon using the IBM Q Experience for Researchers (see https://quantumexperience.ng.bluemix.net/qx/experience).

[34] Bennett, et al (1993). See also: Rieffel & Polak (2000).
[35] Riebe, et al (2004). Barrett, et al (2004).
[36] Jin, et al (2010).
[37] Ren, et al (2017). A report appears in: Emerging Technology from the arXiv, "First Object Teleported from Earth to Orbit", MIT Technology Review, 10 July 2017, https://www.technologyreview.com/s/608252/first-object-teleported-from-earth-to-orbit
[38] Note that due to the measurement Alice made, the measured qubits have collapsed. Hence only Bob's qubit can be in one of the states a|0〉 + b|1〉, a|1〉 + b|0〉, a|0〉 - b|1〉, or a|1〉 - b|0〉.
Jefferson Lab's Free-Electron Laser explores promise of carbon nanotubes

Webs of nanotubes form on collector plates during the collaboration's FEL experiment (image not actual size).

Jefferson Lab's Free-Electron Laser used to explore the fundamental science of how and why nanotubes form, paying close attention to the atomic and molecular details

Scientists and technologists of all stripes are working intensively to explore the possibilities of an extremely strong and versatile cylinder so tiny that millions — which in bunches look like an ebony snowflake — could fit easily on the tip of a pin. The objects in question are known as carbon nanotubes, first discovered in 1991 as the elongated form of an all-carbon molecule. Sometimes called CNTs, nanotubes take up an extremely small space but can connect together materials with different properties, even as their own properties can be adjusted depending on formulation. The tubes' "aspect ratio" is enormous: that is, they are very long but not wide, and like an ultra-strong rope, can be extended without sacrificing strength.

CNTs have potential applications in molecular and quantum computing and as components for microelectromechanical sensors, or MEMS. The tubes could also function as a "lab on a chip," with attached microelectronics and components that could detect toxins and nerve agents in vanishingly small concentrations. Nanotubes could also lead to an entirely new generation of materials: as strong or stronger than steel, but very lightweight. CNTs are amazingly damage-tolerant, generally displaying nearly total "elastic recovery," even under high-deformation conditions. If bent, buckled or creased, the tubes are usually able to reassume their original shape once external stressors are removed.

"Nanotubes take up a very small amount of space but can connect a lot of material together," says Brian Holloway, an assistant professor in the College of William & Mary's Department of Applied Science. "You can imagine replacing metal components with nanotubes that could weigh maybe a tenth as much. One of the big reasons NASA is interested is obviously because of the cost of getting to space."

Brian Holloway, William & Mary professor, prepares the nanotube oven, a component that helps produce nanotubes with light from JLab's Free-Electron Laser.

A research team led by Holloway is also intrigued by the tubes' potential. Holloway's group has used Jefferson Lab's Free-Electron Laser (FEL) to explore the fundamental science of how and why nanotubes form, paying close attention to the atomic and molecular details. Already, in experiments, the William & Mary/NASA Langley collaboration has produced tubes as good as if not better than those at other laboratories or in industry. The next step will be to increase quantity while holding costs down, which Holloway believes will be possible using the Lab's upgrade of the FEL to 10 kilowatts.

"Right now we're interested in making more nanotubes," Holloway says. "The FEL offers a way to efficiently and cost-effectively make large amounts of high-quality tubes. Nanotubes come in a variety of flavors; the thought is we could eventually control what we call 'tube chiralities,' [properties like] structure, length and diameter."

The CNT collaboration makes the tubes by striking a metal-impregnated carbon target with FEL light. The laser vaporizes layers of a graphite annulus, essentially a thick ring mounted on a spinning quartz rod.
Atoms discharge from the annulus surface, creating a plume, a kind of nanotube "spray." Under the right conditions trillions upon trillions of nanotubes can be so formed within an hour. Conventional means of nanotube production involves a tabletop laser. In this more traditional manufacturing approach, perhaps 10 milligrams — about one-tenth of an aspirin-bottle full — of the tubes can be produced per hour at costs up to $200 per gram. Conversely, with a one-kilowatt FEL, up to two grams per hour, or about 100 times more nanotubes can be made, at a cost of $100 per gram. A 10-kilowatt FEL could radically alter that equation. To that end, Holloway is seeking funding from NASA and the Office of Naval Research for a three-year project whose goal would be to optimize nanotube production with the upgraded FEL in order to manufacture large quantities quickly and cheaply. According to Gwyn Williams, FEL Basic Research Program manager, researchers are anticipating learning much more about the details of the photochemical processes involved in nanotube production once the new FEL comes back on line in 2003. Demand for the tubes is intense and growing. Whoever finds a way to make them reliably and affordably could reap the rewards, financially and otherwise, as commercial interests beat a figurative path to researchers' doors. "A lot of people can make nanotubes. Very few can make grams or kilograms of nanotubes on time scales less than weeks," Holloway points out. "Factors other than price can drive demand. Right now there's no one who could sell you one kilogram of nanotubes per month all of the same quality, at any price." Jefferson Science Associates, LLC, a joint venture of the Southeastern Universities Research Association, Inc. and PAE, manages and operates the Thomas Jefferson National Accelerator Facility, or Jefferson Lab, for the U.S. Department of Energy's Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit https://energy.gov/science.
The big goal: a large quantum computer

Solving the puzzles and challenges posed in Quantum Moves will allow the team of physicists at Aarhus University to take an important step towards building a large-scale quantum computer. A quantum computer is a computer built using some of the smallest physical building blocks, such as single atoms. It operates under the rules of quantum physics, making it very different from the normal computers we are used to encountering.

Why all the fuss about quantum computers? There are essentially two answers. The first reason is practical. Moore's law, introduced in 1965 by Gordon E. Moore, makes the observation that the computational power of the machines doubles every 18 months. This leads to an ever-decreasing size of fundamental logical units. The fact that this decrease cannot go on indefinitely poses the fundamental challenges that chip manufacturers currently face in achieving more computational power. Very soon the size of a chip will be comparable to a single atom. Why is this such a challenge? The laws of physics change radically when we enter the world of atoms and other small things. In this world, all physical rules are dictated by quantum mechanics. These rules forbid some things that we are used to, such as certain positions of atoms or electrons. On the other hand, they allow things that are impossible in our everyday life, like being in many places at the same time and walking through walls! Shrinking a computer to this size means that the computer is not just small, it's also pretty crazy. Scientists have to work hard to make sure that their new minuscule computers do all the calculations they are supposed to do, despite the quantum rules.

The second reason to get excited about quantum computers is exactly all the quirkiness of quantum objects! Used right, all the new features of the quantum computer can be harnessed to make totally different types of computing machines that are much more powerful than ordinary computers. Take the ability of an atom to be in two places simultaneously, or to be in two states at the same time. This is known as the superposition principle and it's pretty much like seeing the head-side and the tail-side of a coin at the same time. It enables the creation of qubits (quantum bits), bits that can be 'zero' and 'one' at the same time. All computations are based on operations on bit strings, long chains of 'zeros' and 'ones'. If we have a string of qubits, we can code gigantic amounts of information into reasonably short chains. This is because a string of, say, three qubits can be in a state '000' at the same time as it is in a state '001', '011', '111' and so on - in fact, we can express 2^3=8 bit strings in a 3-qubit string (see the short calculation after this paragraph). Four qubits can express 2^4=16 bit strings, 10 qubits express a whopping 1024 strings, and if we have a qubit string of just 260 qubits, it can contain more information than there are atoms in the universe. This quantum property allows huge parallel computations and means that just one quantum computer would have more computational power than all conventional computers combined!

Prototypes of quantum computers are currently developed in physics labs all around the world. The problem they face is the immense difficulty of making them big enough to be really interesting. The quantum objects used to create qubits are badly behaved rascals and rarely do the exact things they are told to do.
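The counting argument above, written out. The "atoms in the universe" figure below is a commonly quoted low-end estimate of about 10^78, not a number from this page; estimates range up to roughly 10^82.

```python
for n in (3, 4, 10):
    print(f"{n:>2} qubits -> {2 ** n} bit strings at once")

print(f"260 qubits -> a {len(str(2 ** 260))}-digit number of bit strings")

atoms_in_universe = 10 ** 78   # assumed low-end estimate, for illustration only
n = 1
while 2 ** n < atoms_in_universe:
    n += 1
print(f"about {n} qubits already index more bit strings than that estimate of atoms")
```

With the low-end estimate the crossover lands at roughly 260 qubits, which is where the figure quoted above comes from; with a higher estimate of the atom count it moves up by only a dozen or so qubits, since every added qubit doubles the number of strings.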
The best available prototypes can tame strings of about 10-20 qubits, which is not yet enough to unleash the full power of quantum computing. Our contribution in Aarhus, Quantum Moves, is based on an idea of storing single atoms in a very specific trap, where each atom sits in a well like an egg in an egg tray. With such a trap, physicists can store around 300 atoms in a neat configuration. Just imagine being able to use each of these atoms as a qubit! This is, in fact, possible, and with such a configuration we are getting tantalisingly close to having a quantum computer with the superpowers everyone is fussing over. The missing piece is doing operations and calculations on this large qubit string. Operations are done by picking up individual atoms, moving them around the egg tray trap and merging them with other atoms. This has to be done with great care to preserve the state of the atom (whether it's 'zero', 'one' or a superposition of both) - but also fast, to avoid outside noise and keep calculations efficient. Sound familiar? This is exactly the challenge posed in the Quantum Moves game! Players move and merge atoms with the goal of ending up in very specific final shapes (which are actually just different states). Our engines take this information and translate it to the way a laser beam is used to pick up and move around an atom in the lab. Different challenges in Quantum Moves correspond to different operations on the qubit strings. Once we have a whole toolbox of operations, we are ready to turn on the 300-qubit quantum computer and start addressing the hugely important and impactful task of working on problems that have been "impossible" to solve for the best present-day supercomputers.
Earth emits gravitational waves as it orbits the Sun, though the amount of energy lost is imperceptible over the lifetime of the Solar System. Binary black holes are a different matter: Once they are relatively close, they shed a tremendous amount of energy, bringing them closer together with each orbit. (Binary black holes are thought to emit more gravitational energy as they merge than regular stars emit in the form of UV, IR, and visible light over their entire lifetimes of billions of years.) Eventually their event horizons will touch, and the system emits a lot more gravitational waves in a phase known as “ring-down,” as the lumpy, uneven merged mass becomes a smooth, perfectly symmetrical black hole. [Read more…]

(OK, it doesn’t scan. So sue me.) Quantum entanglement is a challenging topic, and one which has tripped up a lot of people (including many physicists!) over the decades. In brief, entanglement involves two (or more) particles constituting a single system: measurement on one particle instantly determines the result of similar measurements on the second, no matter how far they are separated in space. While no information is transferred in this process, it’s still at odds with our everyday experience with how the world should work. I updated my earlier explanation of entanglement, which hopefully can help clear up some of the confusion. Recent work either assumes entanglement is real and probes some of the more interesting implications, or tests some mathematical relations known as Bell’s inequalities. The latter are aimed at quantifying the difference between the predictions of quantum physics and certain alternative models. In that spirit, a group of researchers proposed using light from quasars to randomize the measurement apparatus in entanglement experiments, to eliminate the tiny possibility of a weird loophole in quantum theory.

If a detector has some correlation with the hidden variables of the particles being measured, then the two detectors don’t act independently. That’s true even if only a very tiny amount of information is exchanged less than a millisecond before measurements take place. The interaction would create the illusion that the particles are entangled in a quantum sense, when in fact they are influencing the detectors, which in turn dictate what measurements are being taken. This is known as the “detector settings independence” loophole—or somewhat facetiously as the “free will” loophole, since it implies the human experimenter has little or no choice over the detector settings. [Read more…]

Final note: this is probably the first paper I’ve covered that involves both my undergraduate research focus (quantum measurement) and my PhD work (cosmology), albeit in a much different way than both.

Cassiopeia A (abbreviated Cas A) is a historical oddity. The supernova was relatively close to Earth—a mere 11,000 light-years distant—and should have been visible around CE 1671, yet no astronomers of any culture recorded it. That’s in stark contrast to famous earlier explosions: Tycho’s supernova, Kepler’s supernova, and of course the supernova that made the Crab Nebula. This mysterious absence has led some astronomers to speculate that some unknown mechanism diffused the energy from the explosion, making the supernova far less bright than expected. [Read more…]

“Dark energy” is one of the more unfortunate names in science.
You’d think it has something to do with dark matter (itself a misnomer), but it has the opposite effect: while dark matter drives the clumping-up of material that makes galaxies, dark energy pushes the expansion of the Universe to greater and greater rates. Though we should hate on the term “dark energy”, we should respect Michael Turner, the excellent cosmologist who coined the phrase. He is also my academic “grand-advisor”: he supervised Arthur Kosowsky’s PhD, and Arthur in turn supervised mine. And of course, I worked on dark energy as a major part of my PhD research. In my latest piece for Slate, I describe a bit of my dysfunctional relationship with cosmic acceleration, and why after 16 years dark energy is still a matter of frustration for many of us. Because dark energy doesn’t correspond easily to anything in the standard toolkit of physics, researchers have been free to be creative. The result is a wealth of ideas, some that are potentially interesting and others that are frankly nuts. Some string theorists propose that our observable universe is the result of a vast set of parallel universes, each with a different, random amount of dark energy. Other physicists think our cosmos is interacting with a parallel universe, and the force between the two drives cosmic acceleration. Still others suspect that dark energy is a sign that our currently accepted theory of gravity—Einstein’s general theory of relativity—is incomplete for the largest distances. [Read more…] For the upcoming ScienceOnline 2014 meeting, I’m leading a session titled “Reporting Incremental Science in a World that wants Big Results“. It’s an important topic. We who communicate science to the general public have to evaluate stories to see if they’re worth covering, then translate them in such a way that conveys their significance without hyping them (ideally at least). That’s challenging to do on deadline, and we’re not always or maybe even usually experts on the topics we report. I know a fair amount about cosmology and gravitational physics, but very little about galactic astronomy or planetary science — yet I must write about them, because it’s my job. So Stephen Hawking’s recent talk on black holes is an interesting case study. I won’t rehash the whole story here, but I wrote not one but two articles on the subject yesterday. Article 1 was in Slate: Hawking’s own thinking about black holes has changed over time. That’s no criticism: Evidence in science often requires us to reassess our thinking. In this case, Hawking originally argued that black holes violated quantum mechanics by destroying information, then backed off from that assertion based on ideas derived from string theory (namely, the holographic principle). Not everyone agrees with his change of heart, though: The more recent model he used doesn’t correspond directly to our reality, and it may not have an analog for the universe we inhabit. The new talk suggests he has now moved on from both earlier ideas. That’s partly what raises doubts in my mind about the “no event horizons” proposal in the online summary. Is this based on our cosmos or yet another imaginary one of the sort physicists are fond of inventing to guide their thinking? In my reading, it’s hard to tell, and in the absence of a full explanation we are free to project our own feelings about both Hawking and his science onto the few details available. 
[Read more…]

Article 2 was a follow-up on my own blog: But at the same time, we have to admit that nobody—not Nature News, not Slate.com—would have covered a paper this preliminary had Hawking’s name not been attached. Other people are working on the same problem (and drawing different conclusions!), but they can’t command space on major science news sites. So, by covering Hawking’s talk, we are back on that treacherous path: we’re showing how science works in a way, but we risk saying that a finding is important because somebody famous is behind it. [Read more…]

One new model, proposed by Anastasia Fialkov, Rennan Barkana, and Eli Visbal, suggests that energetic X-rays could have heated the primordial gas to the point that reionization happened relatively rapidly. That’s in contrast with other hypotheses, which predict a more gradual reionization process. The X-rays in the new model were emitted by systems that include neutron stars or black holes. The nicest feature of the new proposal is that it predicts a unique pattern in light emission from the primordial gas, which could conceivably be measured by current radio telescopes. [Read more…]
New nanotechnology findings from physicists at the University of Maryland have moved us significantly closer to a "holy grail" of clean energy research – the efficient, cost-effective generation of clean hydrogen fuel from sunlight. The UMD team created a fundamentally new synthesis strategy for hybrid nanostructures that they and other scientists say makes possible new nanostructures and nanotechnologies with huge potential applications ranging from clean energy and quantum computing advances to new sensor development. The team demonstrated the power of their method by creating a photocatalyst that is almost 15 times more efficient in using solar energy to split water (H2O) into hydrogen and oxygen than conventional photocatalysts. Photocatalysts are substances that use light to boost chemical reactions. Chlorophyll is a natural photocatalyst used by plants.

"The ingenious nano-assemblies that Professor Ouyang and his collaborators have fabricated, which include the novel feature of a silver-gold particle that super-efficiently harvests light, bring us a giant step nearer to the so-far elusive goal of artificial photosynthesis: using sunlight to transform water and carbon dioxide into fuels and valuable chemicals," says Professor Martin Moskovits of the University of California at Santa Barbara, a recognized expert in this area of research and not affiliated with the paper.

Lighting the Way to Clean, Efficient Power

Hydrogen fuel cells have long been considered a tremendously promising, clean alternative to gasoline and other carbon-based (fossil) fuels that are currently used for cars, electrical generation and most other energy applications. A fuel cell combines stored hydrogen gas with oxygen from the air to produce electricity that can power vehicles, homes and businesses. The only byproduct of hydrogen fuel cells is water. Combustion of gasoline and other carbon-based fuels emits pollutants, including carbon dioxide, the principal greenhouse gas contributing to climate change. It's expected that in 2015, American consumers will finally be able to purchase fuel cell cars from Toyota and other manufacturers. Although these will be zero-emissions vehicles, most of the hydrogen fuel to power them currently is made from natural gas, a fossil fuel that contributes to climate change and increasingly is being produced by the controversial process known as fracking.

The cleanest way to produce hydrogen fuel is using solar energy to split water into hydrogen and oxygen. However, decades of research advances have not yielded photocatalytic methods with sufficient energy efficiency to be cost effective for use in large-scale water splitting applications. Efficient creation of hydrogen fuel from sunlight is also critical to development of large-scale solar energy plants because hydrogen fuel is an ideal way to store, for later use, the energy generated by such facilities.

The UMD team's work advances the efficiency of photocatalysts and lays the foundation for much larger future advances by more fully realizing a light-generated nanoparticle effect first used by ancient Romans to create glass that changes color based on light. This effect, known as surface plasmon resonance, involves the generation of high energy electrons using light. UMD team leader Min Ouyang, an associate professor in the department of physics and the Maryland NanoCenter, explains that plasmon resonance is the generation of a collective oscillation of low energy electrons by light.
The light energy stored in such a "plasmonic oscillator" then can be converted to energetic carriers (i.e., "hot" electrons) for use in photocatalysis and many other applications. "Using our new modular synthesis strategy, our UMD team created an optimally designed, plasmon-mediated photocatalytic nanostructure that is almost 15 times more efficient than conventional photocatalysts," says Ouyang. In studying this new photocatalyst, Min and his colleagues identified a previously unknown "hot plasmon electron-driven photocatalysis mechanism with an identified electron transfer pathway." It is this new mechanism that makes possible the high efficiency of the UMD team's new photocatalyst. And it is a finding made possible by the precise materials control allowed by the team's new general synthesis method. The UMD team says their findings hold great promise for future advances to make water splitting cost effective for large-scale use in creating hydrogen fuel. And the team's newly discovered mechanism for creating hot (high energy) electrons should also be applicable to research involving other photo-excitation processes.

A Fundamental Nanotechnology Advance

The findings of Min and his colleagues were published recently in Nature Communications. Their primary discovery is a fundamentally new synthesis strategy for hybrid nanostructures that uses a connector, or "intermedium," nanoparticle to join multiple different nanoparticles into nanostructures that would be very difficult or perhaps even impossible to make with existing methods. The resultant mix-and-match modular component approach avoids the limitations in material choice and nanostructure size, shape and symmetry that are inherent in the crystalline growth (epitaxial) synthesis approaches currently used to build nanostructures.

"Our approach makes it possible to design and build higher order [more complex and materially varied] nanostructures with a specifically designed symmetry or shape, akin to the body's ability to make different protein oligomers each with a specific function determined by its specific composition and shape," says Ouyang. "Such a synthesis method is the dream of many scientists in our field and we expect researchers now will use our approach to fabricate a full class of new nanoscale hybrid structures," he says.

One of the many scientists excited about the new UMD method is the University of Delaware's Matt Doty, an associate professor of materials science and engineering, physics, and electrical and computer engineering and associate director of the UD Nanofabrication Facility. "The work of Weng and coauthors provides a powerful new tool for the 'quantum engineering' of complex nanostructures designed to implement novel electronic and optoelectronic functions. [Their] new approach makes it feasible for researchers to realize much more sophisticated nanostructure designs than were previously possible," he says.

Support for this research was provided by the Office of Naval Research, the U.S. Department of Energy, the National Science Foundation, and the Research Corporation for Science Advancement.

Hierarchical synthesis of non-centrosymmetric hybrid nanostructures and enabled plasmon-driven photocatalysis, Lin Weng, Hui Zhang, Alexander O. Govorov and Min Ouyang.
Nature Communications; Article number: 4792; doi: 10.1038/ncomms5792 (September 15, 2014)
Seth Lloyd, a professor of mechanical engineering at MIT, is among the pioneers of quantum computing: he proposed the first technologically feasible design for a quantum computer. If humans ever build a useful, general-purpose quantum computer, it will owe much to Lloyd. Earlier this year, he published a popular introduction to quantum theory and computing, titled Programming the Universe, which advanced the startling thesis that the universe is itself a quantum computer.

Technology Review: In your new book, you are admirably explicit: you write, “The Universe is indistinguishable from a quantum computer.” How can that be true?

Seth Lloyd: I know it sounds crazy. I feel apologetic when I say it. And people who have reviewed the book take it as a metaphor. But it’s factually the case. We couldn’t build quantum computers unless the universe were quantum and computing. We can build such machines because the universe is storing and processing information in the quantum realm. When we build quantum computers, we’re hijacking that underlying computation in order to make it do things we want: little and/or/not calculations. We’re hacking into the universe.

TR: Your critics can be forgiven for thinking you wrote metaphorically. In every era, scientists have likened the universe to the most complicated technology they knew. Newton thought the universe was like a clock.

SL: You could be more blunt: “Lloyd builds quantum computers; therefore, Lloyd thinks the universe is a quantum computer.” But I think that’s unfair.

TR: You famously believe in “it from bit”: that is, that information is a physical property of the universe, and that information generates more-complex information – and with it, all the phenomenal world.

SL: Imagine the electron, which an ordinary computer uses to store data. How can it have information associated with it? The electron can be either here or there. So it registers a bit of information, one of two possibilities: on or off.

TR: Sure, but how does the quantity of information increase?

SL: If you’re looking for places where the laws of physics allow for information to be injected into the universe, then you must look to quantum mechanics. Quantum mechanics has a process called “decoherence” – which takes place during measurement, for instance. A qubit [or quantum bit] that was, weirdly, both here and there is suddenly here or there. Information has been added to the universe.

TR: And why does the universe tend to complexity?

SL: This notion of the universe as a giant quantum computer gets you something new and important that you don’t get from the ordinary laws of physics. If you look back 13.8 billion years to the beginning of the universe, the Initial State was extremely simple, only requiring a few bits to describe. But I see on your table an intricate, very beautiful orchid – where the heck did all that complex information come from? The laws of physics are silent on this issue. They have no explanation. They do not encode some yearning for complexity.

TR: [Utterly bemused] Hmmm …

SL: Could the universe have arisen from total randomness? No. If we imagine that every elementary particle was a monkey typing since time began at the maximum speed allowed by the laws of physics, the longest stretch of Hamlet that could have been generated is something like “To be or not to be, that is the – .” But imagine monkeys typing at computers that recognize the random gibberish as a program.
Algorithmic information theory shows that there are short, random-looking programs that can cause a computer to write down all the laws of physics. So for the universe to be complex, you need random generation, and you need something to process that information according to a few simple rules: in other words, a quantum computer.

TR: More practically: how far are we from widely used, commercial applications of quantum computing?

SL: Today, the largest general-purpose quantum computer is only a dozen bits. So we’re at least a decade or two away. But we’ve already built quantum computers that simulate other quantum systems: you could call them quantum analog computers. These little machines can perform computations that would require an ordinary computer larger than the universe.

TR: What’s the next big thing that needs to be done in quantum computing?

SL: From the techno-geek, experimentalist point of view, it’s the pacification of the microscopic, quantum world. It’s the Wild West down there.

TR: Programming the Universe concludes with a personal note. You describe how your friend Heinz Pagels, a renowned physicist, fell to his death while hiking with you in Colorado. You find some consolation in your theory of universal quantum computation: “But we have not entirely lost him. While he lived, Heinz programmed his own piece of the universe. The resulting computation unfolds in us and around us …”

SL: Well, it’s pretty poor consolation when someone you love is dead. But it’s a truer consolation than the idea that one day you might meet him in heaven.
A new technique for quantum computing could bust open our whole model of how time moves in the universe.

Here's what's long seemed to be true: Time works in one direction. The other direction? Not so much. That's true in life. (Tuesday rolls into Wednesday, 2018 into 2019, youth into old age.) And it's true in a classical computer. What does that mean? It's much easier for a bit of software running on your laptop to predict how a complex system will move and develop in the future than it is to recreate its past. A property of the universe that theorists call "causal asymmetry" demands that it takes much more information — and much more complex calculations — to move in one direction through time than it does to move in the other. (Practically speaking, going forward in time is easier.)

This has real-life consequences. Meteorologists can do a reasonably good job of predicting whether it will rain in five days based on today's weather radar data. But ask the same meteorologists to figure out whether it rained five days ago using today's radar images? That's a much more challenging task, requiring a lot more data and much bigger computers.

Information theorists suspected for a long time that causal asymmetry might be a fundamental feature of the universe. As long ago as 1927, the physicist Arthur Eddington argued that this asymmetry is the reason we only move forward through time, and never backward. If you understand the universe as a giant computer constantly calculating its way through time, it's always easier — less resource-intensive — for things to flow forward (cause, then effect) than backward (effect, then cause). This idea is called the "arrow of time."

But a new paper, published July 18 in the journal Physical Review X, opens the door to the possibility that that arrow is an artifact of classical-style computation — something that's only appeared to us to be the case because of our limited tools. A team of researchers found that in certain circumstances causal asymmetry disappears inside quantum computers, which calculate in an entirely different way. Unlike classical computers, in which information is stored in one of two states (1 or 0), quantum computers store information in subatomic particles that follow some bizarre rules and so can each be in more than one state at the same time. And, even more enticingly, their paper points the way toward future research that could show causal asymmetry doesn't really exist in the universe at all.

Very orderly and very random systems are easy to predict. (Think of a pendulum — ordered — or a cloud of gas filling a room — disordered.) In this paper, the researchers looked at physical systems that had a Goldilocks level of disorder and randomness — not too little, and not too much. (So, something like a developing weather system.) These are very difficult for computers to understand, said study co-author Jayne Thompson, a complexity theorist and physicist studying quantum information at the National University of Singapore.

Next, they tried to figure out those systems' pasts and futures using theoretical quantum computers (no physical computers involved). Not only did these models of quantum computers use less memory than the classical computer models, she said, they were able to run in either direction through time without using up extra memory. In other words, the quantum models had no causal asymmetry.
"While classically, it might be impossible for the process to go in one of the directions [through time]," Thompson told Live Science, "our results show that 'quantum mechanically,' the process can go in either direction using very little memory." And if that's true inside a quantum computer, that's true in the universe, she said. Quantum physics is the study of the strange probabilistic behaviors of very small particles — all the very small particles in the universe. And if quantum physics is true for all the pieces that make up the universe, it's true for the universe itself, even if some of its weirder effects aren't always obvious to us. So if a quantum computer can operate without causal asymmetry, then so can the universe. Of course, seeing a series of proofs about how quantum computers will one day work isn't the same thing as seeing the effect in the real world. But we're still a long way off from quantum computers advanced enough to run the kind of models this paper describes, they said. What's more, Thompson said, this research doesn't prove that there isn't any causal asymmetry anywhere in the universe. She and her colleagues showed there is no asymmetry in a handful of systems. But it's possible, she said, that there are some very bare-bones quantum models where some causal asymmetry emerges. "I'm agnostic on that point," she said. The next step for this research, she said, is to answer that question — to figure out whether causal asymmetry exists in any quantum models. This paper doesn't prove that time doesn’t exist, or that we’ll one day be able to slip backward through it. But it does appear to show that one of the key building blocks of our understanding of time, cause and effect, doesn't always work in the way scientists have long assumed — and might not work that way at all. What that means for the shape of time, and for the rest of us, is still something of an open question. The real practical benefit of this work, she said, is that way down the road quantum computers might be capable of easily running simulations of things (like the weather) in either direction through time, without serious difficulty. That would be a sea change from the current classical-modeling world. Originally published on Live Science.
<urn:uuid:031935e3-cb79-4ee6-bf10-e99612aa7991>
CC-MAIN-2019-26
https://www.livescience.com/63182-quantum-computer-reverse-arrow-time.html
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560628000613.45/warc/CC-MAIN-20190627035307-20190627061307-00375.warc.gz
en
0.951629
1,208
3.84375
4
Computers learn to imagine the future by Garrett Kenyon In many ways, the human brain is still the best computer around. For one, it's highly efficient. Our largest supercomputers require millions of watts, enough to power a small town, but the human brain uses approximately the same energy as a 20-watt bulb. While teenagers may seem to take forever to learn what their parents regard as basic life skills, humans and other animals are also capable of learning very quickly. Most of all, the brain is truly great at sorting through torrents of data to find the relevant information to act on. At an early age, humans can reliably perform feats such as distinguishing an ostrich from a school bus, for instance – an achievement that seems simple, but illustrates the kind of task that even our most powerful computer vision systems can get wrong. We can also tell a moving car from the static background and predict where the car will be in the next half-second. Challenges like these, and far more complex ones, expose the limitations in our ability to make computers think like people do. But recent research at Los Alamos National Laboratory is changing all that. Neuroscientists and computer scientists call this field neuromimetic computing – building computers inspired by how the cerebral cortex works. The cerebral cortex relies on billions of small biological "processors" called neurons. They store and process information in densely interconnected circuits called neural networks. In Los Alamos, researchers are simulating biological neural networks on supercomputers, enabling machines to learn about their surroundings, interpret data and make predictions much the way humans do. This kind of machine learning is easy to grasp in principle, but hard to implement in a computer. Teaching neuromimetic machines to take on huge tasks like predicting weather and simulating nuclear physics is an enterprise requiring the latest in high-performance computing resources. Los Alamos has developed codes that run efficiently on supercomputers with millions of processing cores to crunch vast amounts of data and perform a mind-boggling number of calculations (over 10 quadrillion!) every second. Until recently, however, researchers attempting to simulate neural processing at anything close to the scale and complexity of the brain's cortical circuits have been stymied by limitations on computer memory and computational power. All that has changed with the new Trinity supercomputer at Los Alamos, which became fully operational in mid-2017. The fastest computer in the United States, Trinity has unique capabilities designed for the National Nuclear Security Administration's stockpile stewardship mission, which includes highly complex nuclear simulations in the absence of testing nuclear weapons. All this capability means Trinity allows a fundamentally different approach to large-scale cortical simulations, enabling an unprecedented leap in the ability to model neural processing. To test that capability on a limited-scale problem, computer scientists and neuroscientists at Los Alamos created a "sparse prediction machine" that executes a neural network on Trinity. A sparse prediction machine is designed to work like the brain: researchers expose it to data – in this case, thousands of video clips, each depicting a particular object, such as a horse running across a field or a car driving down a road.
Cognitive psychologists tell us that by the age of six to nine months, human infants can distinguish objects from background. Apparently, human infants learn about the visual world by training their neural networks on what they see while being toted around by their parents, well before the child can walk or talk. Similarly, the neurons in a sparse prediction machine learn about the visual world simply by watching thousands of video sequences without using any of the associated human-provided labels – a major difference from other machine-learning approaches. A sparse prediction machine is simply exposed to a wide variety of video clips much the way a child accumulates visual experience. When the sparse prediction machine on Trinity was exposed to thousands of eight-frame video sequences, each neuron eventually learned to represent a particular visual pattern. Whereas a human infant can have only a single visual experience at any given moment, the scale of Trinity meant it could train on 400 video clips simultaneously, greatly accelerating the learning process. The sparse prediction machine then uses the representations learned by the individual neurons, while at the same time developing the ability to predict the eighth frame from the preceding seven frames, for example, predicting how a car moves against a static background. The Los Alamos sparse prediction machine consists of two neural networks executed in parallel, one called the Oracle, which can see the future, and the other called the Muggle, which learns to imitate the Oracle's representations of future video frames it can't see directly. With Trinity's power, the Los Alamos team more accurately simulates the way a brain handles information by using only the fewest neurons at any given moment to explain the information at hand. That's the "sparse" part, and it makes the brain very efficient and very powerful at making inferences about the world – and, hopefully, a computer more efficient and powerful, too. After being trained in this way, the sparse prediction machine was able to create a new video frame that would naturally follow from the previous, real-world video frames. It saw seven video frames and predicted the eighth. In one example, it was able to continue the motion of a car against a static background. The computer could imagine the future. This ability to predict video frames based on machine learning is a meaningful achievement in neuromimetic computing, but the field still has a long way to go. As one of the principal scientific grand challenges of this century, understanding the computational capability of the human brain will transform such wide-ranging research and practical applications as weather forecasting and fusion energy research, cancer diagnosis and the advanced numerical simulations that support the stockpile stewardship program in lieu of real-world testing. To support all those efforts, Los Alamos will continue experimenting with sparse prediction machines in neuromorphic computing, learning more about both the brain and computing, along with as-yet undiscovered applications on the wide, largely unexplored frontiers of quantum computing. We can't predict where that exploration will lead, but like that made-up eighth video frame of the car, it's bound to be the logical next step.
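The next-frame-prediction idea is easy to sketch in code. The toy example below is not the Los Alamos sparse prediction machine (its Oracle/Muggle networks and Trinity-scale training are not reproduced here); it is a minimal stand-in, with made-up sizes and a synthetic "video" of a drifting square, that fits an ordinary least-squares map from seven frames to the eighth.

```python
# Toy next-frame predictor: map 7 frames of a synthetic clip to the 8th frame.
# Illustrative only; not the Los Alamos sparse prediction machine.
import numpy as np

SIZE, N_FRAMES, N_CLIPS = 16, 8, 400        # small, made-up problem sizes
rng = np.random.default_rng(0)

def make_clip():
    """A 2x2 bright square drifting one pixel per frame across a SIZE x SIZE image."""
    row = rng.integers(0, SIZE - 2)
    col = rng.integers(0, SIZE - N_FRAMES - 2)
    frames = np.zeros((N_FRAMES, SIZE, SIZE))
    for t in range(N_FRAMES):
        frames[t, row:row + 2, col + t:col + t + 2] = 1.0
    return frames

clips = np.stack([make_clip() for _ in range(N_CLIPS)])
X = clips[:, :7].reshape(N_CLIPS, -1)       # the seven input frames, flattened
Y = clips[:, 7].reshape(N_CLIPS, -1)        # the eighth frame to predict

# Least-squares fit of a linear map from the 7-frame history to the 8th frame.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

test = make_clip()
pred = (test[:7].reshape(1, -1) @ W).reshape(SIZE, SIZE)
print("mean absolute error on an unseen clip:", float(np.abs(pred - test[7]).mean()))
```

The real system learns sparse features from natural video rather than a dense linear map, but the input/output structure is the same: seven frames in, a predicted eighth frame out.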
Garrett Kenyon is a computer scientist specializing in neurally inspired computing in the Information Sciences group at Los Alamos National Laboratory, where he studies the brain and models of neural networks on the Lab’s high-performance computers. Other members of the sparse prediction machine project were Boram Yoon of the Applied Computer Science group and Peter Schultz of the New Mexico Consortium. This story first appeared in Discover.
<urn:uuid:c52f475a-8b22-46ab-b0bf-0cf0b4a776bb>
CC-MAIN-2019-26
https://lasciencepresskits.com/machine-learning/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627997501.61/warc/CC-MAIN-20190615222657-20190616004549-00016.warc.gz
en
0.925442
1,372
4.21875
4
Nobody has built a quantum computer much more powerful than a pocket calculator but that hasn't stopped people worrying about the implications of the post-quantum computing world. Most worried are the people who rely on cryptographic codes to protect sensitive information. When the first decent-sized quantum computer is switched on, previously secure codes such as the commonly used RSA algorithm will become instantly breakable. Which is why cryptographers are scurrying about looking for codes that will be secure in the post-quantum world. Today, Hang Dinh at the University of Connecticut and a couple of pals show that cryptographers have been staring at one all along. They say that a little-used code developed by the Caltech mathematician Robert McEliece in 1978 can resist all known attacks by quantum computers. First, let's make a distinction between symmetric and asymmetric codes. Symmetric codes use identical keys for encrypting and decrypting a message. Quantum computers can dramatically speed up an attack against these kinds of codes. However, symmetric codes have some protection. Doubling the size of the key counteracts this speed up. So it is possible for code makers to stay ahead of the breakers, at least in theory. (Although in practice, the safe money would be on the predator in this cat and mouse game.) Asymmetric codes use different keys for encrypting and decrypting messages. In so-called public key encryption systems such as the popular RSA algorithm, a public key is available to anyone who can use it to encrypt a message. But only those with a private key can decrypt the messages and this, of course, is kept secret. The security of these systems relies on so-called trap door functions: mathematical steps that are easy to make in one direction but hard to do in the other. The most famous example is multiplication. It is easy to multiply two numbers together to get a third but hard to start with the third number and work out which two generated it, a process called factorisation. But in 1994, the mathematician Peter Shor dreamt up a quantum algorithm that could factorise much faster than any classical counterpart. Such an algorithm running on a decent quantum computer could break all known public key encryption systems like a 4-year-old running amok in Legoland. Here's a sense of how it works. The problem of factorisation is to find a number that divides exactly into another. Mathematicians do this using the idea of periodicity: a mathematical object with exactly the right periodicity should divide the number exactly, any others will not. One way to study periodicity in the classical world is to use Fourier analysis, which can break down a signal into its component waves. The quantum analogue to this is quantum Fourier sampling and Shor's triumph was to find a way to use this idea to find the periodicity of the mathematical object that reveals the factors. Thanks to Shor, any code that relies on this kind of asymmetry (ie almost all popular public key encryption systems) can be cracked using a quantum Fourier attack. The McEliece cryptosystem is different. It too is asymmetric but its security is based not on factorisation but on a version of a conundrum that mathematicians call the hidden subgroup problem. What Dinh and buddies have shown is that this problem cannot be solved using quantum Fourier analysis. In other words it is immune to attack by Shor's algorithm. In fact, it is immune to any attack based on quantum Fourier sampling. That's a big deal.
It means that anything encoded in this way will be safe when the next generation of quantum computers start chomping away at the more conventional public key cryptosystems. One such system is Entropy, a peer-to-peer communications network designed to resist censorship based on the McEliece cryptosystem. But Entropy is little used and there are good reasons why others have resisted the McEliece encryption system. The main problem is that both the public and private keys are somewhat unwieldy: a standard public key is a large matrix described by no fewer than 2^19 bits. That may seem less of a problem now. It's possible that the McEliece system will suddenly become the focus of much more attention more than 30 years after its invention. However, it's worth pointing out that while the new work guarantees safety against all known quantum attacks, it does nothing of the sort for future quantum attacks. It's perfectly possible that somebody will develop a quantum algorithm that will tear it apart as easily as Shor's can with the RSA algorithm. "Our results do not rule out other quantum (or classical) attacks," say Dinh and co. So a more likely scenario for future research is that cryptographers will renew their efforts in one of the several other directions that are looking fruitful, such as lattice-based algorithms and multivariate cryptography. Either way, expect to hear a lot more about post-quantum cryptography – provided the powers that be allow. Ref: arxiv.org/abs/1008.2390 : The McEliece Cryptosystem Resists Quantum Fourier Sampling Attacks
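As an aside on the periodicity idea described above, the sketch below shows the purely classical version of the order-finding reduction that Shor's algorithm speeds up: find the period r of a^x mod N, and the factors of N usually follow from gcd computations. The brute-force period search makes this exponentially slow for real key sizes (which is exactly why a quantum Fourier transform is needed); the numbers used here are tiny and purely illustrative.

```python
# Classical toy version of the order-finding reduction behind Shor's algorithm.
# The brute-force search for the period only works for tiny N; a quantum computer
# finds the period efficiently via quantum Fourier sampling.
from math import gcd
from random import randrange

def find_order(a, N):
    """Smallest r > 0 with a**r % N == 1 (assumes gcd(a, N) == 1)."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor(N):
    while True:
        a = randrange(2, N)
        g = gcd(a, N)
        if g > 1:                       # lucky guess already shares a factor with N
            return g, N // g
        r = find_order(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            p = gcd(pow(a, r // 2, N) - 1, N)
            if 1 < p < N:
                return p, N // p        # otherwise loop and try another a

print(factor(15))    # -> (3, 5) or (5, 3)
print(factor(77))    # -> (7, 11) or (11, 7)
```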
<urn:uuid:d56f0e04-783b-4742-99af-322e4f0ebc03>
CC-MAIN-2019-26
https://www.technologyreview.com/s/420287/1978-cryptosystem-resists-quantum-attack/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998513.14/warc/CC-MAIN-20190617163111-20190617185111-00261.warc.gz
en
0.944111
1,092
3.828125
4
1. Five Generations of Computers
- The amount of data or instructions a computer can store in its memory is measured in bytes.
- It is a worldwide system of computer networks – a network of networks in which users at any one computer can get information from any other computer (if they have permission).
How does the Internet work?
Administration of Internet
- Internet Corporation for Assigned Names and Numbers (ICANN), a US non-profit organization, administers the allocation of domain names and IP addresses.
- Internet Society (ISOC) is responsible for developing internet technical standards.
- A computer or an array of computers that act as one collective machine capable of processing enormous amounts of data.
- They work at very high speeds and perform complex jobs such as nuclear research or forecasting weather patterns.
- It channels all its power into executing a few programs as fast as possible rather than executing many programs concurrently.
- It uses parallel processing instead of the serial processing in the case of an ordinary computer
Supercomputers in India
| # | Supercomputer | Institution |
|---|---------------|-------------|
| 1 | SahasraT | Indian Institute of Science, Bengaluru |
| 2 | Aaditya | Indian Institute of Tropical Meteorology, Pune |
| 3 | TIFR-Cray XC30 | Tata Institute of Fundamental Research, Mumbai |
| 4 | HP Apollo 6000 | Indian Institute of Technology, Delhi |
| 5 | PARAM Yuva-2 | Centre for Development of Advanced Computing (C-DAC), Pune |
| 6 | PARAM ISHAN | Indian Institute of Technology, Guwahati |
Supercomputers of the World
5. Quantum Computing
- Quantum computing studies computation systems that make direct use of quantum-mechanical phenomena to perform operations on data.
- Classical computers encode information in bits. Each bit can take the value of 1 or 0. These 1s and 0s act as on/off switches that ultimately drive computer functions. Quantum computers, on the other hand, are based on qubits, which operate according to two key principles of quantum physics: superposition and entanglement.
- Superposition means that each qubit can represent both a 1 and a 0 at the same time.
- Entanglement means that qubits in a superposition can be correlated with each other, i.e. the state of one (whether it is a 1 or a 0) can depend on the state of another. (A short state-vector sketch of superposition and entanglement appears at the end of these notes.)
6. Types of Cybercrimes
7. Cloud Computing
- It is an Internet-based computing solution where shared resources are provided like electricity distributed on the electrical grid
- Computers in the cloud are configured to work together and the various applications use the collective computing power as if they are running on a single system.
IT PROJECTS IN INDIA
1. National Supercomputer Mission (NSM)
- The Mission envisages empowering our national academic and R&D institutions spread over the country by installing a vast supercomputing grid comprising a cluster of more than 70 high-performance computing facilities
- The Mission would be implemented and steered jointly by the Department of Science and Technology (DST) and Department of Electronics and Information Technology (DeitY) at an estimated cost of Rs.4500 crore over a period of seven years.
- To make India one of the world leaders in Supercomputing and to enhance India's capability in solving grand challenge problems of national and global relevance
- To empower our scientists and researchers with state-of-the-art supercomputing facilities and enable them to carry out cutting-edge research in their respective domains
- To minimize redundancies and duplication of efforts, and optimize investments in supercomputing
- To attain global competitiveness and ensure self-reliance in the strategic area of supercomputing technology
- Climate Modelling
- Weather Prediction
- Aerospace Engineering
- Computational Biology
- Molecular Dynamics
- Atomic Energy Simulations
- National Security/ Defence Applications
- Seismic Analysis
- Disaster Simulations and Management
- Computational Chemistry
- Computational Material Science and Nanomaterials
- Discoveries beyond Earth (Astrophysics)
- Large Complex Systems Simulations and Cyber Physical Systems
- Big Data Analytics
- Information repositories/ Government Information Systems
2. National e-Governance Plan
- An initiative of the Government of India to make all Government services available to the citizens of India via electronic media
- It was formulated by the Department of Electronics and Information Technology (DeitY) and Department of Administrative Reforms & Public Grievances (DAR&PG) to reduce government costs and allow citizen access to government services through Common Service Centres (CSC).
- It comprises 27 Mission Mode Projects (MMPs) and 10 program support components.
3. e-Kranti/National e-Governance Plan 2.0
- It is an important pillar of the Digital India programme.
- The vision of e-Kranti is "Transforming e-Governance for Transforming Governance".
- The Mission of e-Kranti is to ensure a Government-wide transformation by delivering all Government services electronically to citizens through integrated and interoperable systems via multiple modes, while ensuring efficiency, transparency and reliability of such services at affordable costs.
4. National Knowledge Network (NKN)
- It aims to bridge the gap between rural education, urban education, and international education by interconnecting all universities, government as well as private institutions of higher learning and research with a high-speed data communication network in the country.
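As noted in the Quantum Computing section of these notes, superposition and entanglement are the two key ideas. The snippet below is a minimal state-vector sketch of both, using nothing but NumPy and two ideal, noise-free qubits; it is an illustration of the math, not a description of any real quantum hardware.

```python
# Two ideal qubits as state vectors: superposition via a Hadamard gate, entanglement via CNOT.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                        # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)                # flips qubit 2 when qubit 1 is |1>

plus = H @ ket0                         # equal superposition of |0> and |1>
bell = CNOT @ np.kron(plus, ket0)       # Bell state (|00> + |11>) / sqrt(2)

probs = np.abs(bell) ** 2               # measurement probabilities for |00>, |01>, |10>, |11>
print("P(00), P(01), P(10), P(11) =", np.round(probs.real, 3))
# -> [0.5, 0.0, 0.0, 0.5]: the two qubits are perfectly correlated (entangled).
```

The printed probabilities, 0.5 for |00> and 0.5 for |11> and zero otherwise, are exactly the correlated-qubit behaviour the entanglement bullet describes.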
<urn:uuid:3d93cc79-87d8-4a36-bf73-bb6382e6d94b>
CC-MAIN-2019-26
https://www.civilsdaily.com/computers-supercomputers-quantum-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998084.36/warc/CC-MAIN-20190616082703-20190616104703-00222.warc.gz
en
0.86218
1,159
3.734375
4
Special Relativity. It's been the bane of space explorers, futurists and science fiction authors since Albert Einstein first proposed it in 1905. For those of us who dream of humans one day becoming an interstellar species, this scientific fact is like a wet blanket. Luckily, there are a few theoretical concepts that have been proposed that indicate that Faster-Than-Light (FTL) travel might still be possible someday. A popular example is the idea of a wormhole: a speculative structure that links two distant points in space time that would enable interstellar space travel. Recently, a team of Ivy League scientists conducted a study that indicated how "traversable wormholes" could actually be a reality. The bad news is that their results indicate that these wormholes aren't exactly shortcuts, and could be the cosmic equivalent of "taking the long way"! Originally, the theory of wormholes was proposed as a possible solution to the field equations of Einstein's Theory of General Relativity (GR). Shortly after Einstein published the theory in 1915, the German physicist Karl Schwarzschild found a possible solution that not only predicted the existence of black holes, but of corridors connecting them. Unfortunately, Schwarzschild found that any wormhole connecting two black holes would collapse too quickly for anything to cross from one end to the other. The only way they could be traversable would be if they were stabilized by the existence of exotic matter with negative energy density. Daniel Jafferis, the Thomas D. Cabot Associate Professor of Physics at Harvard University, had a different take. As he described his analysis during the 2019 April meeting of the American Physical Society in Denver, Colorado: "The prospect of traversable wormhole configurations has long been a source of fascination. I will describe the first examples that are consistent in a UV completable theory of gravity, involving no exotic matter. The configuration involves a direct connection between the two ends of the wormhole. I will also discuss its implications for quantum information in gravity, the black hole information paradox, and its relation to quantum teleportation." For the purposes of this study, Jafferis examined the work performed by Einstein and Nathan Rosen in 1935. Looking to expand upon the work of Schwarzschild and other scientists seeking solutions to GR, they proposed the possible existence of "bridges" between two distant points in space time (known as "Einstein–Rosen bridges" or "wormholes") that could theoretically allow for matter and objects to pass between them. By 2013, this theory was used by theoretical physicists Leonard Susskind and Juan Maldacena as a possible resolution between GR and "quantum entanglement". Known as the ER=EPR conjecture, this theory suggests that wormholes are why an elementary particle's state can become entangled with that of a partner, even if they are separated by billions of light years. It was from here that Jafferis developed his theory, postulating that wormholes could actually be traversed by light particles (aka. photons). To test this, Jafferis conducted an analysis with the assistance of Ping Gao and Aron Wall (a Harvard graduate student and Stanford University research scientist, respectively). What they found was that while it is theoretically possible for light to traverse a wormhole, they are not exactly the cosmic shortcut we were all hoping for them to be.
As Jafferis explained in an AIP press statement, "It takes longer to get through these wormholes than to go directly, so they are not very useful for space travel." Basically, the results of their analysis showed that a direct connection between the black holes is shorter than the wormhole connection. While this certainly sounds like bad news to people who are excited by the prospect of interstellar (and intergalactic) travel someday, the good news is that this theory provides some new insight into the realm of quantum mechanics. "The real import of this work is in its relation to the black hole information problem and the connections between gravity and quantum mechanics," said Jafferis. The "problem" he refers to is known as the Black Hole Information Paradox, something that astrophysicists have been struggling with since 1975, when Stephen Hawking discovered that black holes have a temperature and slowly leak radiation (aka. Hawking radiation). This paradox relates to how black holes are able to preserve any information that passes into them. Even though any matter accreted onto their surface would be compressed to the point of singularity, the matter's quantum state at the time of its compression would be preserved thanks to time dilation (it becomes frozen in time). But if black holes lose mass in the form of radiation and eventually evaporate, this information would be lost. By developing a theory through which light can travel through a black hole, this study could represent a means of resolving this paradox. Rather than radiation from black holes representing a loss of mass-energy, it could be that Hawking Radiation is actually coming from another region of space time. It may also help scientists who are attempting to develop a theory that unifies gravity with quantum mechanics (aka. quantum gravity, or a "Theory of Everything"). This is due to the fact that Jafferis used quantum field theory tools to postulate the existence of traversable wormholes, thus doing away with the need for exotic particles and negative mass (which appear inconsistent with quantum gravity). As Jafferis explained: "It gives a causal probe of regions that would otherwise have been behind a horizon, a window to the experience of an observer inside a spacetime, that is accessible from the outside. I think it will teach us deep things about the gauge/gravity correspondence, quantum gravity, and even perhaps a new way to formulate quantum mechanics." As always, breakthroughs in theoretical physics can be a two-edged sword, giving with one hand and taking away with the other. So while this study may have thrown more cold water on the dream of FTL travel, it could very well help us unlock some of the Universe's deeper mysteries. Who knows? Maybe some of that knowledge will allow us to find a way around this stumbling block known as Special Relativity!
<urn:uuid:19cc6e2d-1058-4a35-8efb-99a18deae88a>
CC-MAIN-2019-26
https://restlesssheep.com/space/you-could-travel-through-a-wormhole-but-its-slower-than-space-say-scientists/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627998913.66/warc/CC-MAIN-20190619043625-20190619065625-00185.warc.gz
en
0.951757
1,305
3.640625
4
Blockchain cryptography is at the very heart of what keeps cryptocurrencies and other digital assets safe from hackers and other cyber-attacks. Public key encryption provides each user with a public and private key, which are extremely difficult to guess through brute-force attacks, at least using today's computing resources. However, developments in quantum computing will make brute-force attacks far easier in the future. Here, we will take an in-depth look at how a quantum computer could successfully attack existing blockchain cryptography. Considering some projects are already making headway, we'll also look at how blockchains can be secured against quantum machines. How Can a Quantum Computer Break Blockchain Cryptography? Blockchain uses public key encryption, where each user is given a public and private key to secure their digital assets. These keys are generated using a cryptographic method called prime number factorization, which is the backbone of all modern cryptography. The mathematical principle behind prime number factorization is that any number, no matter how large, can be produced by multiplying prime numbers. It's relatively easy to produce any number using prime numbers. However, it's vastly more difficult to reverse the process and work out which prime numbers were multiplied to produce a particular value once the numbers become large. This reversal is called prime number factorization. Key Encryption and Prime Number Factorization Blockchain cryptography relies on prime number factorization for linking the public and private key. The prime number factors of the public key are what form the private key. Because today's computers, even using the advantages of networks, cannot recover the prime factors of the public key, our digital assets can remain secure against attackers. For example, in 2009, researchers used a network of computers to try and factor a number 232 digits long. It took the equivalent of 2,000 years of work for a single computer launching such an attack. Computer security specialists nevertheless thought this was an unacceptable risk. Thus, current encryption standards use prime numbers that are 309 digits long. Quantum computers are capable of performing many thousands more calculations per second than today's computers, even accounting for the network effect. Quantum machines are still in a relatively early stage of development. However, it's thought that over the next decade, quantum computers will become sufficiently powerful to break existing blockchain cryptography. Therefore, one of the challenges for the blockchain developer community is ensuring that existing blockchains are resilient enough to withstand attacks from tomorrow's quantum computers. The Specific Threat of Quantum Computing to Blockchain Because all current cybersecurity relies on encryption using prime number factorization, the advent of quantum computing isn't just a threat to blockchain encryption. It has implications across the whole of the internet and all connected computers. However, centralized entities control pretty much all websites and networks outside of blockchain. Therefore, it isn't a significant problem to implement an upgrade across the network or website. On the other hand, decentralized networks control blockchains. Decentralization means that every computer on the network has to agree to upgrade at the same time for the upgrade to become active.
Not only that, but because the quantum threat to blockchain cryptography is specific to the public and private keys, all wallets will need to upgrade to the new software to ensure quantum resistance. The Worst Bear Market in Future History? Satoshi Nakamoto is thought to own around a million Bitcoins, not to mention his fortunes from the many Bitcoin hard forks over the years. If the Bitcoin network pushes through an upgrade to ensure quantum resistance, and Satoshi doesn't upgrade his BTC wallets to the new protocol, his wallets remain vulnerable to the quantum threat. So, even if all other holders of BTC upgrade their wallets, a quantum attack could still see Satoshi's one million BTC stolen and sold off onto the market in one fell swoop. Even worse though, it's not just whales that are at risk. After all, anyone consciously sitting on crypto-wealth will be eager to upgrade as soon as possible. However, it's thought that around four million BTC are lost due to their users losing their private keys. Someone stealing and then selling this volume of crypto in a short space of time could have a devastating effect on the markets. Therefore, developing quantum-resistant blockchain cryptography is not necessarily the problem. The implementation across thousands or even millions of wallets becomes the real challenge. Securing Blockchain Cryptography against the Quantum Threat Most people still think the quantum threat is several years away, perhaps even more than a decade. However, the above scenario illustrates why it's vital that developments in blockchain cryptography already start to consider quantum resistance as a precautionary measure. One-time Signatures with Cryptographic Hashing Quantum Resistant Ledger (QRL) is not one of the biggest blockchain projects out there. However, its sole use case is in ensuring quantum-resistant blockchain cryptography. The project works from the principle that prediction timelines about advancements in quantum technology may be fallible. For this reason, we should already start preparing for the eventuality that quantum developments may arrive sooner than we think. The QRL blockchain completely does away with prime number factorization for blockchain cryptography. Instead, it makes use of the eXtended Merkle Signature Scheme (XMSS), which is a complex model. However, in principle, it involves generating key pairs using cryptographic hashing. This is the same concept as hashing a block in a blockchain to protect the contents. These key pairs are for one-time use only and are aggregated together using a Merkle tree. Because the signatures are based on cryptographic hashing rather than prime number factorization, they are far more complicated to brute-force. This hashing makes them more resistant to quantum attacks. The Nexus blockchain uses a similar mechanism when handling transactions, called signature chains. Nexus hashes the public key, so although the hash is visible on the blockchain, the key itself isn't readable. The public key hash then generates a one-time private key as an authorization signature for the transaction. Afterwards, the wallet automatically creates a new public/private key pair for the next transaction, along with a sending or receiving address for the current transaction. In this way, the transaction keys are separate from the address, making it more secure against quantum attacks. Although the threat may still be some way off, blockchain cryptography faces some unique challenges from quantum computing.
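To make the hash-based, one-time-signature idea behind schemes like XMSS concrete, here is a toy Lamport signature in Python. This is a teaching sketch only: it is not QRL's or Nexus's actual code, each key pair can sign just one message, and a production scheme would aggregate many such keys under a Merkle tree, exactly as described above.

```python
# Toy Lamport one-time signature: security rests on hashing, not prime factorization.
# Educational sketch only; not the QRL/XMSS or Nexus implementation.
import hashlib, secrets

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    # 256 pairs of random secrets; the public key is their hashes.
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    pk = [[sha256(a), sha256(b)] for a, b in sk]
    return sk, pk

def sign(message: bytes, sk):
    digest = int.from_bytes(sha256(message), "big")
    # Reveal one secret per bit of the message digest (key pair must never be reused).
    return [sk[i][(digest >> i) & 1] for i in range(256)]

def verify(message: bytes, sig, pk) -> bool:
    digest = int.from_bytes(sha256(message), "big")
    return all(sha256(sig[i]) == pk[i][(digest >> i) & 1] for i in range(256))

sk, pk = keygen()
sig = sign(b"send 1 BTC to Alice", sk)
print(verify(b"send 1 BTC to Alice", sig, pk))    # True
print(verify(b"send 1 BTC to Mallory", sig, pk))  # False
```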
The developer community is very much aware of the threat. Hence, the introduction of innovative solutions such as the ones listed here by QRL and Nexus. Implementing these kinds of solutions will prove to be the most challenging part of quantum-proofing the major blockchains such as Bitcoin and Ethereum. However, the blockchain developer community is nothing if not creative. It will be fascinating to see some of the ideas coming out of the quantum challenge, and to find out which of those ideas will ultimately evolve into the most resilient solutions. The post How Blockchain Cryptography is Fighting the Rise of Quantum Machines appeared first on CoinCentral.
<urn:uuid:f861e7af-fc5d-4e78-a054-bb1ef00b0e3b>
CC-MAIN-2019-26
https://thebitcoinnews.com/how-blockchain-cryptography-is-fighting-the-rise-of-quantum-machines/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627999040.56/warc/CC-MAIN-20190619184037-20190619210037-00266.warc.gz
en
0.932036
1,500
3.671875
4
What is Artificial Leaf?
- The artificial leaf is a silicon-based technology which produces hydrogen energy (a clean fuel) by utilizing sunlight to split water molecules into hydrogen and oxygen.
- It is designed to mimic photosynthesis, the natural mechanism by which plants utilize sunlight to produce carbohydrates and store energy. But the Artificial Leaf is designed to be far more efficient in photosynthesis than plants. Hence it is like a photosynthesis machine.
- The invention of the Artificial Leaf is credited to an American chemist named Daniel G. Nocera and his colleagues in 2011.
- It is an amalgamation of physics, engineering, and biology.
- It is a clean way of producing energy that will someday become the main weapon in the fight against climate change and also serve as an important power source, particularly in developing nations like India.
How is it different from the natural leaf?
A plant utilizes just 1% of the energy it gets from the sun for converting CO2 and water into carbohydrates or glucose, whereas the artificial leaf can utilize 20% of the energy it receives from the sun to produce hydrogen.
What is the need for Artificial Leaf?
- The primary application of the artificial leaf is the production of clean energy, that is, hydrogen. Hydrogen is being used in a variety of different sustainable technologies. But conventional techniques of capturing hydrogen such as steam reforming and hydraulic fracturing (or fracking) tend to release potentially harmful chemicals into the environment, which is not desirable for sustainable development.
- This could solve the major challenge solar and wind power plants face in producing and storing energy when the sun is not shining and the air is still, as well as the associated costs.
- Around 3 billion people are living in regions that do not have any access to electricity. Hence, there is a need for a simple device like the Artificial Leaf that is compatible with local conditions.
- The artificial leaf is highly relevant for countries like India, which is blessed with immense sunlight throughout the year but has not adequately translated it into energy.
How does it work?
- The Artificial Leaf system consists of semiconductors piled up in a way to simulate the natural leaf system. When sunlight strikes the semiconductors, electrons transfer in one direction, generating electric current. The current instantaneously splits water into hydrogen and oxygen.
- The resultant hydrogen gas can then be used for the immediate production of electricity or can be stored for later usage.
- Semiconductors are coated in chemical catalysts such as platinum to speed up the water-splitting reaction.
- The main by-product of this process is water, which is why researchers believe that the artificial leaf is the cleanest source of energy generation.
What are the applications of Artificial Leaf?
- It has the potential to transform the transportation sector in a big way by making even long-distance air travel affordable and environmentally sustainable. It paves the way for eco-friendly cars to become a common mode of transport in the future, since artificial leaves can produce liquid hydrocarbon fuels that could be burned in modern car engines without any major alterations in design or technology.
- The artificial leaf makes hydrogen a renewable source of energy as sunlight and water are abundant on Earth.
- With the artificial leaf, people can produce their own electricity and can live far away from the electricity grid, as hydrogen energy can be produced anywhere at any time.
It has to be noted that an estimated one to three bottles of water is enough to power a single household in less-developed regions of the world.
- Researchers also claim that, by using the resultant hydrogen from the artificial leaf, we can produce various products such as fertilizers, plastics and drugs with the help of microbes or bacteria.
- It helps mitigate global warming and climate change by reducing the carbon footprint, since it absorbs CO2 from the air to generate hydrogen and releases more oxygen in the process. It is also 100% more efficient at absorbing CO2 from air than a natural plant leaf.
- Artificial leaves also help in making thermal power plants more efficient, because thermal plants produce more CO2, which can be turned into energy to further fuel the power plant. In short, the leaf removes CO2 and at the same time its energy powers up the thermal power stations.
- It provides environmentally friendly and affordable storage of energy when compared to other renewables such as solar and wind power plants.
What are the challenges in the commercialization of Artificial Leaf?
- Efficiency and cost-effectiveness are the two important factors that affect the commercialization process. In initial research, the artificial leaf captured only 4.7% of solar energy for producing hydrogen. Artificial leaf technology remains potentially expensive due to the lack of cheap and abundant materials in the production process.
- Concerns about the safety of hydrogen fuel storage also restrict the practical implementation.
What are the opportunities for successful commercialization?
- Devices developed since the initial research achieved efficiencies as high as 10%.
- Researchers are also working on cheaper catalysts and processes to make the technology affordable for large-scale production.
What is the recent Indian research on the Artificial Leaf about?
- In 2017, Council of Scientific and Industrial Research (CSIR) scientists of India developed an artificial leaf device with improved efficiency. They used gold nanoparticles, titanium dioxide, and quantum dots to make the process efficient.
- Quantum dots are semiconductor nanocrystals with distinct properties that depend on the size of the dots. The device does not need any external voltage and performs better than existing solar cells. Apart from solar cells, quantum dots also have applications in transistors, LEDs, medical imaging and quantum computing.
We are already experiencing the harmful effects of global warming, such as a changing climate and rising water levels, to a degree that will only become more severe if we keep adding CO2 to the atmosphere. The artificial leaf is a big leap forward in producing environmentally sustainable energy, thereby mitigating the harmful effects of climate change, as it absorbs CO2 from the air directly to produce energy. Hence policymakers should actively support researchers with funds and make relevant policies in order to achieve the desired scale in artificial leaf production.
<urn:uuid:64da204b-7346-46b3-a6ea-bab87465a91e>
CC-MAIN-2019-26
https://iasexpress.net/artificial-leaf/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627999040.56/warc/CC-MAIN-20190619184037-20190619210037-00265.warc.gz
en
0.934534
1,334
3.921875
4
(April 3, 2019) -- Building on the Air Force's need to develop tech devices that require minimal charging in the field, the University of Texas at San Antonio (UTSA) is using principles in quantum science and engineering to build a graphene-based logic device. This new technology will improve the energy efficiency of battery-dependent devices from cell phones to computers. "We are developing devices that can operate almost battery-less," said Ethan Ahn, UTSA assistant professor in electrical engineering. UTSA engineers are using spintronics, the study of an electron's intrinsic quantum mechanical property called spin, to allow low-power operation with a possible application in quantum computing. "An electron is a little, but very strong magnet," said Ahn. "Just imagine that an electron spins on its own axis, either up or down." Traditional tech devices use the electronic charge of electrons for power. In spintronics, researchers are tapping the inherent spin of electrons as a new power source. With this new approach, devices will require fewer electrons to operate. There are hurdles, however, in harnessing the power of spin. In quantum computing that harnesses the spin of electrons to transmit information, the challenge for researchers is how to capture spin as efficiently as possible. "If you have 100 electrons injected to the channel to power the next logic circuit, you may only get to use one or two spins because the injection efficiency is very low. This is 98 percent spin lost," said Ahn. To prevent the loss of spin, Ahn has developed the new idea of the "zero-power carbon interconnect" by using nanomaterials as both the spin transport channel and the tunnel barrier. These nanomaterials are like a sheet of paper, a two-dimensional layer of carbon atoms just a few nanometers in thickness, and they form the point of contact where spin is injected into the device. Ahn's prototype is an interconnect built with a reduced graphene oxide layer. "It's novel because we are using graphene, a nanomaterial, to enhance spin injection. By controlling the amount of oxide on the graphene layers, we can fine tune electrons' conductivity," said Ahn. Graphene has widespread appeal because it's the world's strongest nanomaterial. In fact, the room temperature conductivity of graphene is higher than that of any other known material. If successful, the zero-power carbon interconnect that Ahn is creating with his collaborators at UT-Austin and Michigan State University would be integrated into the logic component of a computer chip. The device, once developed, will be submitted to the U.S. Air Force Office of Scientific Research, which is supporting UTSA's work with a three-year grant. "The military needs smaller devices that can operate in remote fields without need to recharge batteries," said Ahn. "If our zero-power carbon interconnect is successful, it will improve the efficiency of graphene spintronics — a crucial step in advancing the next generation of low-power electronics like quantum computing." This interconnect could also be highly beneficial to the cloud computing industry. According to the Data Knowledge Center, on-demand cloud computing platforms such as Amazon Web Services alone consume about two percent of the nation's energy. If the zero-power carbon interconnect is successful, cloud servers, such as those that offer streaming services like Netflix or host data, could operate faster and with less electricity.
<urn:uuid:2e42f1f8-189b-4741-9f1f-71f6bcb0ec80>
CC-MAIN-2019-26
https://www.utsa.edu/today/2019/04/story/Spintronics.html
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627999200.89/warc/CC-MAIN-20190620085246-20190620111246-00348.warc.gz
en
0.92239
1,316
3.59375
4
Quantum Science Satellite (QSS) China's Quantum Science Satellite, nicknamed Micius, is the world's first satellite mission testing quantum communications technology which is likely to become the cornerstone of uncrackable communications systems of the future. The Quantum Science Satellite (QSS) provides the first space-based platform with a long-distance satellite and ground quantum channel, carrying out a series of tests to examine fundamental quantum principles and communications protocols in a full-sized space-to-ground architecture. Completing a two-year mission from a 600-Kilometer orbit, QSS will test long-range quantum communications to evaluate the technology readiness level for a Global Scale Quantum Communications Network. QSS is a project of the Chinese Academy of Sciences with participation of the Austrian Academy of Sciences for a total project value of around $100 million. Space-based quantum communications represent a kind of modern-day space race given that the technology will uncover any tinkering and eavesdropping in the exchange of information between two parties – making it attractive for national security needs and intelligence agencies. A number of projects are currently being worked on by teams in China, Canada, Japan and Singapore and some progress was made in the field of exchanging information via entangled photons. The U.S. is likely developing quantum communications technology as part of classified national defence projects. Quantum communications between space and Earth are accomplished by establishing a long-distance quantum channel. Data sent between the transmitting and receiving stations cannot be copied, stolen or spied on – illustrating why quantum communications have been identified as a criticality in times when attacks on sensitive information have become a threat to the world's governments and private endeavors. Quantum entanglement is a physical phenomenon in which pairs or groups of particles interact in ways such that the quantum state can only be described for the system as a whole and not for the individual particles that are part of that system. The measurement of a property of a particle can be seen as acting on that particle and will change its original quantum property which – in case of entangled particles – will cause the system's quantum state to change. It appears that one particle of an entangled pair "knows" the measurement conducted on the other particle in an exchange of information that can cover any distance. Entanglement is one of the most counter-intuitive features of quantum mechanics because the perfect correlations between entangled systems are in direct conflict with the concepts of classical physics. There are proposed theories that predict quantum entanglement is limited to certain scales concerning mass and length or can be altered in certain gravitational environments. To exploit quantum mechanics for communications, the validity of these theories has to be investigated beyond distances and velocities achievable in ground-based experimentation and in environments where effects of quantum physics and relativity begin to interplay to reveal quantum interference effects that could occur with distances of thousands of Kilometers and speeds closer to relativistic velocities.
Quantum encryption can take advantage of quantum entanglement, utilizing it to detect any eavesdroppers entering the communications loop as their presence would automatically cause quantum states to collapse, ending the flow of information and revealing their spying to the operators of the system. The inherent complexity of quantum mechanics makes it impossible to reverse engineer the quantum key encoded in the polarization of a string of photons, providing an ultra-secure means of communication. Classical cryptography does not allow a provable, unconditionally secure key to be generated simultaneously by two separate communicating parties. While the exchange of information through quantum entanglement is perfectly secure, quantum networks are still vulnerable to denial of service attacks, physical tampering with hardware, and typical issues arising with operational security. The primary goal of the QSS mission is to implement a series of communications experiments between the Quantum Science Satellite and quantum communications stations on the ground. The mission will aim to set up an ultra-long-range quantum channel between the ground and satellite, implementing quantum key distribution for secure quantum experiments. Another experiment will use the satellite as a repeater to connect two quantum ground stations on Earth. QSS will test quantum entanglement over large distances, distributing quantum entangled photons from the satellite to a pair of distant ground stations. This will also put the principles of the non-locality of quantum mechanics to the test. It is hoped that the QSS mission will demonstrate quantum teleportation, a fundamental process needed in quantum communications and quantum computing. This experiment will use a high-quality quantum entanglement source on the ground to achieve ground-to-satellite teleportation. The QSS satellite is a 600-Kilogram spacecraft outfitted with a quantum key communicator, a quantum entanglement emitter, quantum entanglement source, quantum experiment controller and processor, and high-speed coherent laser communicator. Ground-based quantum communications using optical fibers have been demonstrated over distances of a few hundred Kilometers. Photons traveling through optical fibers or Earth's air-filled atmosphere are scattered and absorbed – requiring amplifiers to extend a signal's reach, which is extremely difficult to do while maintaining a photon's quantum state. Transmitting signals through the vacuum of space should enable communications over much greater distances. The basic working principle of QSS revolves around a crystal that generates pairs of entangled photons whose properties remain coupled to one another however far apart they are. A high-fidelity optical communications system is then responsible for delivering the partners of the entangled pairs to optical ground stations in Vienna, Austria and Beijing, China, where their polarization properties will be used to generate a secret encryption key. QSS is really the first mission to put to the test whether entanglement can indeed exist between particles separated by very large distances as the laws of quantum mechanics stipulate. Teleportation of quantum states will be attempted by using entangled photons plus data on their quantum states to reconstruct the photons in an identical quantum state in a new location.
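The key-generation and eavesdropper-detection logic described above can be sketched with a toy simulation of a BB84-style prepare-and-measure exchange. This is not the entanglement-based protocol actually flown on QSS, and it ignores optics, loss and noise entirely; it simply shows why comparing a sample of the sifted key exposes an intercept-and-resend eavesdropper, whose measurements disturb the photon states.

```python
# Toy BB84-style key exchange: random bases, sifting, and eavesdropper detection.
# Pure simulation with ideal photons; not the QSS entanglement-based protocol.
import random

N = 2000
random.seed(1)

def run(eavesdropper: bool) -> float:
    alice_bits  = [random.randint(0, 1) for _ in range(N)]
    alice_bases = [random.randint(0, 1) for _ in range(N)]   # 0 = rectilinear, 1 = diagonal
    bob_bases   = [random.randint(0, 1) for _ in range(N)]

    bob_bits = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if eavesdropper:
            e_basis = random.randint(0, 1)
            # Eve's measurement randomizes the bit whenever her basis is wrong,
            bit = bit if e_basis == a_basis else random.randint(0, 1)
            a_basis = e_basis       # and the resent photon is now in Eve's basis.
        # Bob reads the bit reliably only if his basis matches the photon's basis.
        bob_bits.append(bit if b_basis == a_basis else random.randint(0, 1))

    # Sifting: keep positions where Alice's and Bob's publicly announced bases agree.
    kept = [i for i in range(N) if alice_bases[i] == bob_bases[i]]
    sample = kept[: len(kept) // 4]            # publicly compare a quarter of the key
    errors = sum(alice_bits[i] != bob_bits[i] for i in sample)
    return errors / len(sample)

print("error rate without Eve:", run(False))   # ~0.0
print("error rate with Eve:   ", run(True))    # ~0.25 -> eavesdropping detected
```

Without an eavesdropper the compared sample agrees essentially perfectly; with one, roughly a quarter of the sampled bits disagree, which is the signature the two parties look for before trusting the key.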
The QSS spacecraft is based on a microsatellite bus that can host payloads of around 200 Kilograms, providing a stable platform with precise pointing capability for ground stations to lock onto the optical carriers from the satellite and vice versa. The basic design work of QSS was finished by the end of 2011, allowing the project to head into mission definition and technology research in 2012. By mid-2013, the first prototypes of QSS payloads were completed and underwent electronics characteristics testing. A ground test model of the satellite structure was finished by October 2013 for mechanical environment simulations followed by thermal testing. The joint payload systems headed into testing in 2014 before the flight units were manufactured for integration on the satellite bus. Because quantum communications are a sensitive new technology, information on the design of the QSS payload is not available. The two-year primary mission is expected to deliver the data needed for an assessment of the feasibility of establishing a constellation of up to 20 satellites to generate a quantum communications network with global space-to-ground communications coverage – a major step towards establishing a quantum internet. The quantum internet, or a quantum computing cloud, would likely consist of a combination of a satellite and terrestrial network. However, a system of this type requires entangled photons to be created by different sources as well as inter-satellite quantum communications, both of which are still in the more distant future. Also, data rates – currently expected to be in the megabit range – will have to be boosted to several gigabits per second to compete with traditional space-to-ground data links and optical communications using lasers. QSS, if successful, may become a trigger for other nations to invest in the development of quantum communications systems for the government and private sector.
<urn:uuid:98c7a0d5-5757-4da6-b33f-304aeabd3544>
CC-MAIN-2019-26
http://spaceflight101.com/spacecraft/quantum-science-satellite/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560627999000.76/warc/CC-MAIN-20190619143832-20190619165832-00389.warc.gz
en
0.919597
1,552
3.609375
4
The first Wiki was known as WikiWikiWeb and was started in 1994. Wikipedia was launched in 2001. During this period, the technology gestated, waiting for just the right factors to make it relevant and widespread. There are a number of technologies we come across that are overhyped and promise more than they can deliver while still in the early phases of deployment. This deals with the future, and we could get stunning breakthroughs. As with the other articles in this issue, take everything below with a pinch of salt.
Quantum computing promises to speed up computing because of the various quantum states available, instead of just simple binary states. While the term may be common, we are very far away from having quantum computers everywhere. We don't even know what the quantum bit, or qubit, will be made up of: atoms, electrons, photons or some other subatomic particle. The materials needed to build the circuits and measure these particles are also currently under development. The few quantum computers that exist today are run in lab environments, at temperatures near absolute zero. They have a high error rate, and they cannot yet handle the number of qubits that would take their capabilities well beyond the scope of conventional computers. Then, there is the question of creating standards and protocols for quantum computers. A lot of progress will be made over the next decade, but don't expect a consumer grade quantum computer. Even if a consumer ready quantum computer is invented tomorrow, there remains the question of roll out, and even ten years are not enough to make the transition from one computing paradigm to another.
SpaceX Mars base
The first Red Dragon mission to Mars is expected to take off a little after 2018. This is a pathfinding mission with the specific purpose of finding an ideal place to start building a Mars colony. After this, the first cargo missions to Mars on the Big Falcon Rocket are expected to start off in 2022. So far so good, but this will only happen if the new rocket is developed, successfully tested, and works as expected. SpaceX has not even sent a probe to the red planet yet. The first mission will be called the Heart of Gold, and will deliver a propellant plant to the planet. SpaceX has not yet outlined how it plans to mine the resources as raw material for this fuel plant. It will require carbon dioxide and water sourced locally. It is beyond this stage that things get increasingly improbable. SpaceX is planning a manned mission to Mars with a crew of 12 to make sure the propellant plant is running properly. The earliest pioneering missions by NASA are also beyond the 2030 horizon. We are not saying the SpaceX missions will not happen, but expect some delays.
Artificial General Intelligence
There is no question about it, Artificial Intelligence will own the next decade. According to Sundar Pichai, the next decade belongs to AI. In the mid 80s, desktop computers changed everything, in the mid 90s, it was the internet, and in the mid 2000s, we saw the advent of smartphones. It is the mid 2010s now, and according to Google, this will totally be AI. IBM, Microsoft, Baidu and other tech majors are banking on AI as well. Artificial general intelligence, however, is the point at which an AI can do the entire range of tasks that a human can. The problem here is not any specific task, as AI is already better than humans in many areas.
The challenge here is a single AI being able to understand, adapt, and provide a superior response to a human, every single time, when faced with a wide variety of tasks. This kind of AI is called "strong AI" or "full AI". The incipient field of neuromorphic computing, which builds computers structured the same way human brains are, will have to mature first. We still do not have the raw computing power that artificial general intelligences will require. The machines may take our jobs, but they will not be replacing everything that we can do, just yet.
Cross species communication
Species already communicate among themselves. This is about technological interventions to enable cross species communication. While there are already technological solutions to better understand what pets want, or at least make more accurate guesses, we are a long way away from true cross species communication. This is because our brains are all wired differently, and our understanding of the world is considerably different. For example, dogs depend far more on the sense of smell than humans do. There is evidence to suggest that pets can understand human body language better than other humans can. It would be hilarious, and useful, to directly understand what your pet is thinking when it is introduced to novel environments and people. To realise cross species communication, we would need neural implants that understand all the signals in the brain. Then we would need machine learning algorithms to translate the intentions and desires of one species and make them comprehensible to another species. This is just to communicate with humans. Getting your pet parrot to communicate with your pet cat in a meaningful manner is nowhere close to realisation over the course of the next decade. A cross species communication system remains a far cry from reality.
3D printing, or additive manufacturing, has been around for a long time. The first systems were actually created in the 1980s. Even after almost 40 years, it is nowhere close to becoming a consumer grade technology. This is because the end products are not very useful or durable. It also takes a long time to create stuff, and it is expensive as well. Additive manufacturing will continue to grow, but is more likely to be used by services that cater to very specific markets. This could be a 3D printing studio for sundry tasks for artists, architects, and cosplayers. It could also be garages that produce custom spare parts for cars. The actual 3D printers are themselves prohibitively expensive, and do not really offer value for money in terms of the types of items they can produce. The techniques for additive manufacturing will continue to improve, and we may see some implementation on the production line. But consumer level 3D printing has not happened for the last 40 years, and we don't see why it should happen in the next 10.
Multipurpose Household Robots
The challenge with building multipurpose household robots is everything facing artificial general intelligence, plus the robotic implementation. For example, it is very simple to make a robot drive a car, open a door, walk a gangway, or turn a valve. In fact, this can be done just by having robotic cars, doors, and valves. The problem is a single robot that can perform all of these tasks. That would require a roughly human form factor, which robots are very bad at. These robots would be required to use the same appliances and devices that are designed to be used by humans.
An android robot cannot even open a door or smash an axe through one without toppling over. Boston Dynamics is one of the leading companies in this field. The thing is, we might not even need a general purpose household automaton, such as Rosie the Robot in The Jetsons. Household automation, IoT and smart speakers can, between them, accomplish what a household robot possibly could. Forget not happening over the next decade; this one we are chalking down as not even necessary.
Of all the things in the list, drone-based parcel delivery is actually the most likely to come true. Both Amazon and UPS are testing out products that use drones to deliver parcels. The problem is scaling up the service to build a worldwide network. There are many different variables that autonomous drones will have to handle, and they might not be ready to tackle all these situations within the next ten years. There are limitations on the weight and the distance for which a drone can make a delivery. Landing and taking off is a tricky process, and companies are exploring options such as using parachutes and specially marked drop zones. To allow drone delivery at scale, the aircraft will need to be proven safe enough and pass a number of regulatory hurdles. The on-board cameras and sensors, for example, will have to be shown not to invade people's privacy. Then there is the question of security when dispatching high value items such as mobile phones. While drone deliveries will certainly be implemented in certain pockets, we don't see a global implementation within the next decade.
<urn:uuid:c67b9547-2399-45fa-bd31-8168f1c61017>
CC-MAIN-2019-26
https://geek.digit.in/2018/08/the-tech-that-wont-be-widespread-over-the-next-decade/
s3://commoncrawl/crawl-data/CC-MAIN-2019-26/segments/1560628000175.78/warc/CC-MAIN-20190626053719-20190626075719-00549.warc.gz
en
0.953194
1,706
3.5
4
To understand the function of nanoscale materials we need to dip our toes into a discussion of quantum physics. One of the key precepts of quantum physics is that particles and energy waves called photons (like light) can only have distinct amounts of energy, as opposed to having any value of energy along a continuum. The differences between the energy states are called quanta. Quantum physics also tells us that fundamental particles, such as electrons or quarks, have a spin state, described as up or down. Spinning particles exist in pairs, one with up spin and one with down. It’s as though the nanoscale world is digital rather than analog. The most spectacular, everyday manifestation of quantum physics is the rainbow. The hot hydrogen and helium atoms in the sun give off photons of light of precise and specific energy values characteristic of their atomic structure. When refracted by raindrops we see these photons of light arrayed in the colors of the rainbow. Other aspects of quantum physics are much harder to both understand and explain. Perhaps the most exotic is quantum entanglement, which stumped even Albert Einstein. Quantum entanglement works as follows: let electrons A and B be a spin pair with A spinning up and B spinning down. If energy is applied to A to flip its spin to down, B will immediately flip its spin to up. The amazing part of this process is that B’s spin will flip sooner than the time it would take for light to travel from A to B. This was at odds with Einstein’s view that no information or energy in the universe could travel faster than the speed of light. The mechanism responsible for quantum entanglement remains a mystery to this day. The appeal of nanotechnology stems from the fact that the physical and chemical properties of materials less than 100 nanometers exhibit behavior based on quantum physics that are not apparent in larger objects. For example, at the nanometer scale copper turns from opaque to transparent, aluminum becomes flammable (even explosive), and carbon atoms can be assembled into nanotubes which are 100 times stronger than steel while having only one sixth of its weight. These changes in properties, of which we are only in our infancy of discovering, are what drive the interest in nanoscale devices. While quantum physics calculations had long suggested that potentially useful, even revolutionary, devices could be made at the nanoscale level, construction of them was not really possible until the invention of the Scanning Tunneling Microscope (STM) in 1981. The STM works on another fascinating principle in quantum physics called quantum tunneling which, for the sake of brevity and to spare myself the challenge, I will leave unexplained. The beauty of the invention of the STM was the ability to image materials at the nanoscale level so you can see the individual atoms. Without this ability you could not “see” the nanomaterials you were attempting to construct. With the STM in hand, research and entrepreneurship in nanotechnology took off. Current applications of nanomaterials include titanium dioxide particles in your sunscreen which are dramatically better at blocking UV rays compared to larger titanium dioxide particles, and the silver, bacteria-killing nanotubes which have been added to Band-Aids® to promote healing and socks to cut down on foot odor. You might think that silver is a rather expensive material to use for something as mundane as foot odor control. 
However, since the effectiveness of the silver nanotubes is directly dependent on their extraordinarily small size, a minute amount goes a long way. We'll touch on silver nanotubes again later. The potential for new inventions based on nanotechnology is hard to exaggerate. I like to think of it this way. Since the origin of our species millions of years ago, we have been utilizing the macro-scale (>100 nm) materials available to us on Earth to spectacular effect. Think rockets, iPads®, artificial joints and Cheez Whiz®. But with the advent of nanomaterials and their surprising and unique properties, it's almost as though we have an entire new planet of materials to work with. There are far too many potential applications of nanotechnology to include a comprehensive review, so here are several that I find intriguing.
- Nanoscale devices may result in dramatically improved solar cells. An improvement of solar cell efficiency from the current level of 20% to 40-50% would transform global electricity production.
- Quantum computers based on instantaneous quantum entanglement could result in machines that make today's supercomputers look like the Model T.
- Nano-robots could be launched in the upper atmosphere to rebuild the ozone layer.
- Nanotechnology could provide dramatic improvements in fighting cancer. One approach I find promising is the construction of gold, chemoactive-agent-containing nanoparticles that have receptors which can bind to cancer cells. With this approach, chemotherapy could be delivered directly to tumors while sparing patients the ravages of having these toxic drugs coursing through their entire bodies.
Unfortunately, the same properties that make nanomaterials attractive, their small size and physical and chemical reactivity, make them potentially dangerous. Remember the socks with the silver, antibacterial nanotubes? When you run them through the laundry, some of the tubes wash off and enter local waterways where they kill beneficial bacteria. The production and use of nanomaterials also present significant respiratory risks. We already know that inhaling particulates from coal mining, smoking, or asbestos can cause lung cancer. Breathing in tiny, chemically-active nanoparticles would likely be far worse, a concern which is supported by laboratory studies with mice. In the final analysis, nanotechnology shares the same characteristics as most new scientific developments: the potential to result in both great benefit and significant harm. Our challenge as a society will be to utilize them responsibly. Undoubtedly, we will make some mistakes along the way, but in the end the results are likely to amaze.
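To put a rough number on why a minute amount of nanoscale silver goes such a long way, here is a back-of-the-envelope sketch (idealizing the particles as spheres and using the bulk density of silver): for a fixed mass, the total surface area, which is what the bacteria actually encounter, scales inversely with particle radius.

```python
import math

def total_surface_area(total_mass_g, radius_m, density_kg_m3=10_490):
    """Total surface area (m^2) of a given mass of material divided into
    identical spheres of the given radius (default density: bulk silver)."""
    total_volume = (total_mass_g / 1000) / density_kg_m3        # m^3
    volume_per_sphere = (4 / 3) * math.pi * radius_m ** 3
    n_spheres = total_volume / volume_per_sphere
    return n_spheres * 4 * math.pi * radius_m ** 2

one_gram_coarse = total_surface_area(1.0, 0.5e-6)   # 1 um diameter grains
one_gram_nano = total_surface_area(1.0, 5e-9)       # 10 nm diameter grains
print(f"1 g of 1 um silver grains:  {one_gram_coarse:.2f} m^2")
print(f"1 g of 10 nm silver grains: {one_gram_nano:.2f} m^2")
print(f"ratio: {one_gram_nano / one_gram_coarse:.0f}x")   # -> 100x
```

Shrinking the grains by a factor of 100 buys 100 times the reactive surface from the same gram of metal, which is why so little silver is needed.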
<urn:uuid:fbb62987-36e8-4bc1-9f0b-65e485a3ec4f>
CC-MAIN-2014-15
http://chapelboro.com/columns/common-science/nanotechnology/
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609539665.16/warc/CC-MAIN-20140416005219-00634-ip-10-147-4-33.ec2.internal.warc.gz
en
0.943752
1,258
3.6875
4
Researchers from the European Space Agency and Simon Fraser University in British Columbia have been working to develop a robot sticky enough to cling safely to the outside of a spacecraft while also remaining mobile. At this point, the robot, dubbed Abigaille, is able to climb walls on Earth. "This approach is an example of 'biomimicry,' taking engineering solutions from the natural world," said Michael Henrey, a graduate student in engineering at Simon Fraser and a researcher on the project. "Our Abigaille climbing robot is therefore quite dexterous, with six legs each having four degrees of freedom [or joints], so it should be able to handle environments that a wheeled robot could not." He added that the robot can transition from a vertical position to horizontal, which could be useful for navigating around the surface of a satellite or maneuvering around obstacles. For the lizard-like robot, the European Space Agency said it's taking a lesson from the hairs on the bottom of the gecko's feet that enable it to stick to surfaces. "We've borrowed techniques from the microelectronics industry to make our own footpad terminators," Henrey said in a statement. "Technical limitations mean these are around 100 times larger than a gecko's hairs, but they are sufficient to support our robot's weight." The agency has tested the robot to see if it could work in the rigors of a space environment. "The reason we're interested in dry adhesives is that other adhesive methods wouldn't suit the space environment," said Henrey. "Scotch, duct or pressure-sensitive tape would collect dust, reducing their stickiness over time… Velcro requires a mating surface, and broken hooks could contaminate the robot's working environment. Magnets can't stick to composites, for example, and magnetic fields might affect sensitive instruments." It's not uncommon for robotics researchers to build machines based on animals or even insects. In November, scientists at New York University said they had built a small, flying robot to move like the boneless, pulsating, water-dwelling jellyfish. Last spring, Harvard University researchers announced that they had built an insect-like robot that flies by flapping its wings. The robot is so small it has about 1/30th the weight of a U.S. penny. In the fall of 2012, scientists at the University of Sheffield and the University of Sussex in England teamed up to study the brains of honey bees in an attempt to build an autonomous flying robot.
Seasonally adjusted, that would be down 1-2 per cent on a monthly basis and mean that actual chip sales will likely fall 15-16 per cent on a yearly basis. The reason for the fall, the analysts say, is disk drive shortages in Thailand, which have forced costs to rise. The PC market is likely to be more back-loaded this year, the report notes. Handset chip sales were likely also soft in January. Chips for cars were softer after a strong December. Other quirks, such as an early Chinese New Year, also contributed to the low figures in January. Although several chip makers indicated the inventory problems in the fourth quarter had ended, Carnegie thinks that the indicator shows the trend will continue into this year. PCs are the biggest chip users, followed by cell phones. Cars, appliances, base stations, and instruments are other significant users, the analyst said.
The latest fad of using an SoC (System-on-Chip) processor will be incorporated into the new Slim Xbox 360, according to Microsoft, cutting the design down from two processors.
According to Microsoft, the chip was designed with IBM/Global Foundries using a 45nm process and combines the tri-core CPU, AMD/ATI GPU, dual channel memory controller, and I/O onto a single chip with a new front side bus. This technological design is similar to the methods used by AMD's Fusion and Intel's Sandy Bridge offerings. As you might guess, the true reason for Microsoft to use an SoC is to reduce cost. That said, it also reduces heat and increases power efficiency; these are two areas that Microsoft has improved upon with each generation of Xbox 360 that has been released. The new SoC will have 372 million transistors and took Microsoft's development team 5 years of research to bring to life. It is said that Microsoft wanted to pay special attention to guaranteeing compatibility, implementing precision latency and bandwidth throttling that perfectly impersonates the separate chips which made up older Xbox 360s. Now I wonder if Microsoft will drop the Xbox 360 price even more in the Fall.
The tiny computer is being called the Phoenix chip; its size is 1 cubic millimeter and it was made to be used in the human eye. The little computer does not have a lot on its plate. The Phoenix has the job of monitoring the intraocular pressure of glaucoma patients, but do not be fooled by the seemingly simple task: the device is considered a computer by all technical standards. Researcher Dennis Sylvester, a professor at the University of Michigan, says the Phoenix computer is comprised of an ultra-low-power microprocessor, a pressure sensor, memory, an ultra-slim battery, a solar cell and a wireless radio with an antenna that can transmit data to an external device. The Phoenix amazingly uses only 5.3 nanowatts while in use; otherwise it sleeps. The researchers profess that such tiny computers will one day be utilized to track pollution, monitor structural integrity, perform surveillance, or make virtually any object smart and trackable. We are always glad to see universities lead with amazing research to make our lives better.
IBM is breathing new life into a quantum computing research division at its Thomas J. Watson Research Center, reports The New York Times. The computer giant has hired alumni from promising quantum computing programs at Yale and the University of California-Santa Barbara, both of which made quantum leaps in the past year using standard superconducting material. Groups at both universities have been using rhenium or niobium on a semiconductor surface and cooling the system to near absolute zero so that it exhibits quantum behavior. As the Times reports, the method relies on standard microelectronics manufacturing tech, which could make quantum computers easier and cheaper to make. The Santa Barbara researchers told the Times they believe they can double the computational power of their quantum computers by next year. Rather than using transistors to crunch the ones and zeroes of binary code, quantum computers store data as qubits, which can represent one and zero simultaneously. This superposition enables the computers to solve multiple problems at once, providing quick answers to tough questions. But observing a qubit strips it of this duality — you can only see one state at a time — so physicists must figure out how to extract data from a qubit without directly observing it. That's where quantum entanglement comes in handy; two qubits can be connected by an invisible wave so that they share each other's properties. You could then watch one qubit to see what its twin is computing.
None of this is simple, however; there are several competing methods for making the qubits, including laser-entangled ions, LED-powered entangled photons, and more. Google is working with a Canadian firm called D-Wave that has claimed 50-qubit computers, although skeptics have questioned that number. In most systems, the number of entangled qubits remains small, but Yale researchers believe they will increase in the next few years, the Times says. Even better: with all this practice, physicists are getting a lot better at controlling quantum interactions. Their precision has increased a thousand-fold, one researcher said. That’s good news for anyone studying quantum mechanics.
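To make the "watch one qubit to see what its twin is computing" picture a little more concrete, and a little more careful, here is a minimal numpy sketch of measuring a two-qubit Bell state in the computational basis; what entanglement actually delivers is perfectly correlated outcomes, not a way to read one qubit without disturbing the pair.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state |Phi+> = (|00> + |11>) / sqrt(2); amplitudes over |00>,|01>,|10>,|11>
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

probs = np.abs(phi_plus) ** 2                # Born rule
outcomes = rng.choice(4, size=10, p=probs)   # joint measurement results

for o in outcomes:
    qubit_a, qubit_b = divmod(o, 2)          # split the index into two bits
    print(f"A measured {qubit_a}, B measured {qubit_b}")
# Every line shows A and B agreeing: about half the runs give 0,0 and half 1,1.
```

Each individual result is still random; the correlation only shows up when the two measurement records are compared side by side.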
<urn:uuid:3c7c2c40-bc9d-4635-bd17-80da4feec8db>
CC-MAIN-2014-15
http://www.thegurureview.net/tag/microelectronics
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223211700.16/warc/CC-MAIN-20140423032011-00366-ip-10-147-4-33.ec2.internal.warc.gz
en
0.941003
1,637
3.875
4
For a scientist whose career was made by his work on black holes, it might seem a little confusing to read that Stephen Hawking now thinks that they don’t exist. But that’s what “Information Preservation and Weather Forecasting for Black Holes,” the study Hawking published last week on arXiv, says: “there are no black holes.” While this might seem surprising–after all, there’s a huge amount of (indirect) evidence that black holes exist, including a massive one several million times the mass of our Sun at the centre of the Milky Way—it’s really not. It’s Hawking’s latest attempt to solve a paradox that he, and other astrophysicists, have been grappling with for a couple of years. So what’s he talking about? Here’s the background: black holes are objects which are so massive, with such strong gravity, that even light can’t escape. The distance from the black hole, beyond which nothing gets out, is the event horizon. However, Hawking made his name in the 1970s when he published a paper showing that black holes don’t just suck stuff up, endlessly—they spew out a beam of so-called “Hawking radiation” as they absorb other matter. That means black holes actually lose mass over time, eventually whittling away to nothing. Black holes are frustrating, though, because their extreme gravity exposes the major inadequacy in our current scientific understanding of the universe - we don’t know how to reconcile quantum mechanics and general relativity. With general relativity, we can make accurate predictions about objects with certainty, but on the tiny scale of quantum mechanics it’s only possible to talk about the behaviour of objects in terms of probability. When we do the maths on what happens to things that fall into black holes, using relativity gives results that break quantum mechanics; the same goes vice versa. One of the key things about quantum mechanics is that it tells us information can’t be destroyed–that is, if you measure the radiation given off by a black hole, you should be able to build up a picture of what matter fell into the hole to create it. However, if general relativity holds, and nothing can escape from inside the event horizon, then that should apply to that quantum information–any radiation that’s coming out is, Hawking showed, random. It’s the black hole “information paradox.” Either give up quantum mechanics, or accept that information can die. Hawking was in the “information can die” camp, until 2004, when it became clear—thanks to string theory—that quantum mechanics held up (and there’s an excellent in-depth explanation of this in Nature that explores this story more fully if interested). There was just one problem—nobody could work out *how* information was getting out of black holes, even if it was happening mathematically. And, just in case this wasn’t all entirely confusing, it turns out that our best post-2004 theory about what’s been going on gives rise to an entirely new paradox—the “firewall.” It’s to do with quantum entanglement, where two particles are created that are identical on the quantum level. The way it works isn’t exactly clear yet—it could be something to do with string theory and wormholes—but it means that measuring the properties of one particle will give readings that mirror those found on its entangled particle. It might lead to teleportation technology, but scientists aren’t sure yet. 
Joseph Polchinski from the Kavli Institute for Theoretical Physics in Santa Barbara, California, published a paper in 2012 that worked out that the information paradox could be solved if Hawking radiation was quantum entangled with the stuff falling in. But, due to the limitations of entanglement, if this is true, that would mean that at the event horizon a massive amount of energy was given off by particles entering and leaving. Hence "firewall"—anything crossing the event horizon would be burnt to a crisp. And even though most scientists, including Polchinski, thought this couldn't possibly be right—it completely contradicts a lot of the stuff underlying general relativity, for example—nobody's yet managed to disprove it. The choice for physicists, once again, was to: a) accept the firewall, and throw out general relativity, or b) accept that information dies in black holes, and quantum mechanics is wrong. Still with me? Here's where Hawking's latest paper comes in. (That title—"Information Preservation and Weather Forecasting for Black Holes"—might make some more sense too, hopefully.) Hawking's proposed solution, building on an idea first floated in 2005, is that the event horizon isn't as sharply defined as we've come to imagine it. He instead proposes something called an "apparent horizon," which light and other stuff can escape from: "The absence of event horizons mean that there are no black holes—in the sense of regimes from which light can't escape to infinity. There are however apparent horizons which persist for a period of time." Black holes should be treated more like massive galactic washing machines. Stuff falls in and starts getting tossed around, mixed up with other stuff in there, and only eventually is allowed to escape out again when ready. This happens because the quantum effects around a black hole, like weather on Earth, churn so violently and unpredictably that it's just impossible to either predict the position of an event horizon or expect uniform effects for stuff crossing it. While the theoretical basis, that information is preserved, remains, in practice recovering it is so difficult as to be impractical. It's a fudge of an idea, which tries to have its general relativity and quantum mechanics cakes, and eat them, too. Possible weaknesses, as Nature points out, are that it could imply that escaping from black holes is easier than it is in reality. It could also be that the apparent horizons are just as much of a firewall as the traditional conception of an event horizon. Hawking's peers have yet to have a go at assessing his idea, so we'll have to wait to see whether the idea has merit—or whether it merely gives rise to yet more paradoxes. This piece first appeared on newstatesman.com.
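To put numbers on the Hawking radiation discussed above, the standard result is that a black hole of mass M radiates like a black body at temperature T = ħc³/(8πGMk_B). A quick calculation (a generic sketch using textbook constants, not anything taken from Hawking's new paper) shows why the effect has never been detected for astrophysical black holes.

```python
import math

hbar = 1.054571817e-34   # J s
c = 2.99792458e8         # m/s
G = 6.67430e-11          # m^3 kg^-1 s^-2
k_B = 1.380649e-23       # J/K
M_sun = 1.989e30         # kg

def hawking_temperature(mass_kg):
    """Black-body temperature of Hawking radiation for a black hole of given mass."""
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

print(f"1 solar mass:              {hawking_temperature(M_sun):.2e} K")          # ~6e-8 K
print(f"~4 million solar masses:   {hawking_temperature(4e6 * M_sun):.2e} K")    # Milky Way's central hole
# Both temperatures are far colder than the 2.7 K cosmic microwave background,
# so such black holes currently absorb more radiation than they emit.
```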
<urn:uuid:da271c93-3944-4216-8474-8a15a071bc29>
CC-MAIN-2014-15
http://www.newrepublic.com/node/116442/print
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609523429.20/warc/CC-MAIN-20140416005203-00191-ip-10-147-4-33.ec2.internal.warc.gz
en
0.947909
1,360
3.65625
4
The latest news from academia, regulators research labs and other things of interest Posted: Dec 23, 2013 Graphene can host exotic new quantum electronic states at its edges (Nanowerk News) Graphene has become an all-purpose wonder material, spurring armies of researchers to explore new possibilities for this two-dimensional lattice of pure carbon. But new research at MIT has found additional potential for the material by uncovering unexpected features that show up under some extreme conditions — features that could render graphene suitable for exotic uses such as quantum computing. On a piece of graphene (the horizontal surface with a hexagonal pattern of carbon atoms), in a strong magnetic field, electrons can move only along the edges, and are blocked from moving in the interior. In addition, only electrons with one direction of spin can move in only one direction along the edges (indicated by the blue arrows), while electrons with the opposite spin are blocked (as shown by the red arrows). Under typical conditions, sheets of graphene behave as normal conductors: Apply a voltage, and current flows throughout the two-dimensional flake. If you turn on a magnetic field perpendicular to the graphene flake, however, the behavior changes: Current flows only along the edge, while the bulk remains insulating. Moreover, this current flows only in one direction — clockwise or counterclockwise, depending on the orientation of the magnetic field — in a phenomenon known as the quantum Hall effect. In the new work, the researchers found that if they applied a second powerful magnetic field — this time in the same plane as the graphene flake — the material’s behavior changes yet again: Electrons can move around the conducting edge in either direction, with electrons that have one kind of spin moving clockwise while those with the opposite spin move counterclockwise. “We created an unusual kind of conductor along the edge,” says Young, a Pappalardo Postdoctoral Fellow in MIT’s physics department and the paper’s lead author, “virtually a one-dimensional wire.” The segregation of electrons according to spin is “a normal feature of topological insulators,” he says, “but graphene is not normally a topological insulator. We’re getting the same effect in a very different material system.” What’s more, by varying the magnetic field, “we can turn these edge states on and off,” Young says. That switching capability means that, in principle, “we can make circuits and transistors out of these,” he says, which has not been realized before in conventional topological insulators. There is another benefit of this spin selectivity, Young says: It prevents a phenomenon called “backscattering,” which could disrupt the motion of the electrons. As a result, imperfections that would ordinarily ruin the electronic properties of the material have little effect. “Even if the edges are ‘dirty,’ electrons are transmitted along this edge nearly perfectly,” he says. Jarillo-Herrero, the Mitsui Career Development Associate Professor of Physics at MIT, says the behavior seen in these graphene flakes was predicted, but never seen before. This work, he says, is the first time such spin-selective behavior has been demonstrated in a single sheet of graphene, and also the first time anyone has demonstrated the ability “to transition between these two regimes.” That could ultimately lead to a novel way of making a kind of quantum computer, Jarillo-Herrero says, something that researchers have tried to do, without success, for decades. 
But because of the extreme conditions required, Young says, "this would be a very specialized machine" used only for high-priority computational tasks, such as in national laboratories. Ashoori, a professor of physics, points out that the newly discovered edge states have a number of surprising properties. For example, although gold is an exceptionally good electrical conductor, when dabs of gold are added to the edge of the graphene flakes, they cause the electrical resistance to increase. The gold dabs allow the electrons to backscatter into the oppositely traveling state by mixing the electron spins; the more gold is added, the more the resistance goes up. This research represents "a new direction" in topological insulators, Young says. "We don't really know what it might lead to, but it opens our thinking about the kind of electrical devices we can make." The experiments required the use of a magnetic field with a strength of 35 tesla — "about 10 times more than in an MRI machine," Jarillo-Herrero says — and a temperature of just 0.3 degrees Celsius above absolute zero. However, the team is already pursuing ways of observing a similar effect at magnetic fields of just one tesla — similar to a strong kitchen magnet — and at higher temperatures. Philip Kim, a professor of physics at Columbia University who was not involved in this work, says, "The authors here have beautifully demonstrated excellent quantization of the conductance," as predicted by theory. He adds, "This is very nice work that may connect topological insulator physics to the physics of graphene with interactions. This work is a good example how the two most popular topics in condensed matter physics are connected each other." Source: By David L. Chandler, MIT
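As a footnote to the "quantization of the conductance" Kim mentions, the natural unit for edge-state conduction is the conductance quantum e²/h; the short sketch below (generic constants, not tied to the particular plateaus seen in the MIT experiment) gives its size.

```python
e = 1.602176634e-19   # elementary charge, C
h = 6.62607015e-34    # Planck constant, J s

g0 = e**2 / h         # conductance quantum
print(f"e^2/h = {g0:.3e} S  (about {g0 * 1e6:.1f} microsiemens)")
print(f"h/e^2 = {1 / g0:.0f} ohms (about 25.8 kilo-ohms)")
# Edge-state conductances in quantum Hall experiments appear as integer
# (or simple fractional) multiples of e^2/h.
```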
<urn:uuid:fdfe3feb-1b60-4bda-b81f-6c600abde467>
CC-MAIN-2014-15
http://www.nanowerk.com/nanotechnology_news/newsid=33809.php
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609536300.49/warc/CC-MAIN-20140416005216-00258-ip-10-147-4-33.ec2.internal.warc.gz
en
0.94076
1,168
3.53125
4
A crucial step in a procedure that could enable future quantum computers to break today's most commonly used encryption codes has been demonstrated by physicists at the U.S. Commerce Department's National Institute of Standards and Technology (NIST). Image: This colorized image shows the fluorescence from three trapped beryllium ions illuminated with an ultraviolet laser beam. Black and blue areas indicate lower intensity, and red and white higher intensity. As reported in the May 13 issue of the journal Science, the NIST team showed that it is possible to identify repeating patterns in quantum information stored in ions (charged atoms). The NIST work used three ions as quantum bits (qubits) to represent 1s or 0s—or, under the unusual rules of quantum physics, both 1 and 0 at the same time. Scientists believe that much larger arrays of such ions could process data in a powerful quantum computer. Previous demonstrations of similar processes were performed with qubits made of molecules in a liquid, a system that cannot be expanded to large numbers of qubits. "Our demonstration is important, because it helps pave the way toward building a large-scale quantum computer," says John Chiaverini, lead author of the paper. "Our approach also requires fewer steps and is more efficient than those demonstrated previously." The NIST team used electromagnetically trapped beryllium ions as qubits to demonstrate a quantum version of the "Fourier transform" process, a widely used method for finding repeating patterns in data. The quantum version is the crucial final step in Shor's algorithm, a series of steps for finding the "prime factors" of large numbers—the prime numbers that when multiplied together produce a given number. Developed by Peter Shor of Bell Labs in 1994, the factoring algorithm sparked burgeoning interest in quantum computing. Modern cryptography techniques, which rely on the fact that even the fastest supercomputers require very long times to factor large numbers, are used to encode everything from military communications to bank transactions. But a quantum computer using Shor's algorithm could factor a number several hundred digits long in a reasonably short time. This algorithm made code breaking the most important application for quantum computing. Quantum computing, which harnesses the unusual behavior of quantum systems, offers the possibility of parallel processing on a grand scale. Unlike switches that are either fully on or fully off in today's computer chips, quantum bits can be on, off, or on and off at the same time. The availability of such "superpositions," in addition to other strange quantum properties, means that a quantum computer could solve certain problems in an exponentially shorter time than a conventional computer with the same number of bits. Researchers often point out that, for specific classes of problems, a quantum computer with 300 qubits has potentially more processing power than a classical computer containing as many bits as there are particles in the universe. Harnessing all this potential for practical use is extremely difficult. One problem is that measuring a qubit causes its delicate quantum state to collapse, producing an output of an ordinary 1 or 0, without a record of what happened during the computation. Nevertheless, Shor's algorithm uses these properties to perform a useful task.
It enables scientists to analyze the final quantum state after the computation to find repeating patterns in the original input, and to use this information to determine the prime factors of a number. The work described in the Science paper demonstrated the pattern-finding step of Shor's algorithm. This demonstration involves fewer and simpler operations than those previously implemented, a significant benefit in designing practical quantum computers. In the experiments, NIST researchers performed the same series of operations on a set of three beryllium qubits thousands of times. Each set of operations lasted less than 4 milliseconds, and consisted of using ultraviolet laser pulses to manipulate individual ions in sequence, based on measurements of the other ions. Each run produced an output consisting of measurements of each of the three ions. The NIST team has the capability to measure ions' quantum states precisely and use the results to manipulate other ions in a controlled way, before the delicate quantum information is lost. The same NIST team has previously demonstrated all the basic components for a quantum computer using ions as qubits, arguably a leading candidate for a large-scale quantum processor. About a dozen different types of quantum systems are under investigation around the world for use in quantum processing, including the approach of using ions as qubits. The new work was supported in part by the Advanced Research and Development Activity/National Security Agency. As a non-regulatory agency, NIST develops and promotes measurement, standards and technology to enhance productivity, facilitate trade and improve the quality of life.
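The quantum Fourier transform demonstrated here is the piece of Shor's algorithm that extracts a repeating period; everything that follows is ordinary number theory. The sketch below is purely classical, with a brute-force search standing in for the quantum period-finding step, and shows how a measured period turns into the prime factors of a small number such as 15.

```python
from math import gcd

def order(a, N):
    """Smallest r > 0 with a**r = 1 (mod N); this is the quantum computer's
    job in Shor's algorithm, done here by brute force."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_from_order(a, N):
    r = order(a, N)
    if r % 2 == 1:
        return None                      # odd period: try a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None                      # trivial case: try a different a
    return gcd(y - 1, N), gcd(y + 1, N)  # nontrivial factors of N

N, a = 15, 7
print(f"order of {a} mod {N} is {order(a, N)}")      # 4
print(f"factors of {N}: {factor_from_order(a, N)}")  # (3, 5)
```

In the full algorithm, the quantum Fourier transform is what reveals the period r from a superposition of modular powers; once r is known, the factoring itself is easy.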
<urn:uuid:07afc277-73ad-4ba1-afeb-7f32e2e02518>
CC-MAIN-2014-15
http://phys.org/news4082.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609526311.33/warc/CC-MAIN-20140416005206-00210-ip-10-147-4-33.ec2.internal.warc.gz
en
0.91935
1,035
4.125
4
First Electronic Quantum Processor Created 2009 07 01 A team led by Yale University researchers has created the first rudimentary solid-state quantum processor, taking another step toward the ultimate dream of building a quantum computer. The two-qubit processor is the first solid-state quantum processor that resembles a conventional computer chip and is able to run simple algorithms. (Credit: Blake Johnson/Yale University) They also used the two-qubit superconducting chip to successfully run elementary algorithms, such as a simple search, demonstrating quantum information processing with a solid-state device for the first time. Their findings appeared in Nature's advanced online publication June 28. "Our processor can perform only a few very simple quantum tasks, which have been demonstrated before with single nuclei, atoms and photons," said Robert Schoelkopf, the William A. Norton Professor of Applied Physics & Physics at Yale. "But this is the first time they've been possible in an all-electronic device that looks and feels much more like a regular microprocessor." Working with a group of theoretical physicists led by Steven Girvin, the Eugene Higgins Professor of Physics & Applied Physics, the team manufactured two artificial atoms, or qubits ("quantum bits"). While each qubit is actually made up of a billion aluminum atoms, it acts like a single atom that can occupy two different energy states. These states are akin to the "1" and "0" or "on" and "off" states of regular bits employed by conventional computers. Because of the counterintuitive laws of quantum mechanics, however, scientists can effectively place qubits in a "superposition" of multiple states at the same time, allowing for greater information storage and processing power. For example, imagine having four phone numbers, including one for a friend, but not knowing which number belonged to that friend. You would typically have to try two to three numbers before you dialed the right one. A quantum processor, on the other hand, can find the right number in only one try. "Instead of having to place a phone call to one number, then another number, you use quantum mechanics to speed up the process," Schoelkopf said. "It's like being able to place one phone call that simultaneously tests all four numbers, but only goes through to the right one." These sorts of computations, though simple, have not been possible using solid-state qubits until now in part because scientists could not get the qubits to last long enough. While the first qubits of a decade ago were able to maintain specific quantum states for about a nanosecond, Schoelkopf and his team are now able to maintain theirs for a microsecond—a thousand times longer, which is enough to run the simple algorithms. To perform their operations, the qubits communicate with one another using a "quantum bus"—photons that transmit information through wires connecting the qubits—previously developed by the Yale group. The key that made the two-qubit processor possible was getting the qubits to switch "on" and "off" abruptly, so that they exchanged information quickly and only when the researchers wanted them to, said Leonardo DiCarlo, a postdoctoral associate in applied physics at Yale's School of Engineering & Applied Science and lead author of the paper. Next, the team will work to increase the amount of time the qubits maintain their quantum states so they can run more complex algorithms. They will also work to connect more qubits to the quantum bus. 
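The four-phone-numbers analogy maps onto Grover's search over four items, where a single oracle query is enough. Here is a small numpy sketch of that textbook case (a simulation of the abstract algorithm, not the Yale team's actual two-qubit pulse sequence); the marked entry comes out with probability one.

```python
import numpy as np

def grover_two_qubits(marked):
    """One Grover iteration on 4 items; returns the final outcome probabilities."""
    n = 4
    state = np.full(n, 1 / np.sqrt(n))          # uniform superposition

    oracle = np.eye(n)
    oracle[marked, marked] = -1                 # flip the sign of the marked item
    state = oracle @ state

    s = np.full(n, 1 / np.sqrt(n))
    diffusion = 2 * np.outer(s, s) - np.eye(n)  # inversion about the mean
    state = diffusion @ state

    return np.abs(state) ** 2

print(grover_two_qubits(marked=2))   # -> [0. 0. 1. 0.]: the right "phone number" every time
```

For larger lists the advantage is quadratic: on the order of the square root of N oracle calls instead of roughly N/2 on average for a classical search.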
The processing power increases exponentially with each qubit added, Schoelkopf said, so the potential for more advanced quantum computing is enormous. But he cautions it will still be some time before quantum computers are being used to solve complex problems. "We're still far away from building a practical quantum computer, but this is a major step forward." Authors of the paper include Leonardo DiCarlo, Jerry M. Chow, Lev S. Bishop, Blake Johnson, David Schuster, Luigi Frunzio, Steven Girvin and Robert Schoelkopf (all of Yale University), Jay M. Gambetta (University of Waterloo), Johannes Majer (Atominstitut der Österreichischen Universitäten) and Alexandre Blais (Université de Sherbrooke). Article source: ScienceDaily.com
<urn:uuid:6f60b217-3242-4d4f-a38e-7f52cf696a70>
CC-MAIN-2014-15
http://www.redicecreations.com/article.php?id=6996
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609527423.39/warc/CC-MAIN-20140416005207-00545-ip-10-147-4-33.ec2.internal.warc.gz
en
0.916374
1,498
3.859375
4
(PhysOrg.com) -- Until now, scientists have thought that the process of erasing information requires energy. But a new study shows that, theoretically, information can be erased without using any energy at all. Instead, the cost of erasure can be paid in terms of another conserved quantity, such as spin angular momentum. In the study, physicists Joan Vaccaro from Griffith University in Queensland, Australia, and Stephen Barnett from the University of Strathclyde in Glasgow, UK, have quantitatively described how information can be erased without any energy, and they also explain why the result is not as contentious as it first appears. Their paper is published in a recent issue of the Proceedings of the Royal Society A. Traditionally, the process of erasing information requires a cost that is calculated in terms of energy, more specifically heat dissipation. In 1961, Rolf Landauer argued that there was a minimum amount of energy required to erase one bit of information, i.e. to put a bit in the logical zero state. The energy required is positively related to the temperature of the system's thermal reservoir, and can be thought of as the system's thermodynamic entropy. As such, this entropy is considered to be a fundamental cost of erasing a bit of information. However, Vaccaro and Barnett have shown that an energy cost can be fully avoided by using a reservoir based on something other than energy, such as spin angular momentum. Subatomic particles have spin angular momentum, a quantity that, like energy, must be conserved. Basically, instead of heat being exchanged between a qubit and thermal reservoir, discrete quanta of angular momentum are exchanged between a qubit and spin reservoir. The scientists described how repeated logic operations between the qubit's spin and a secondary spin in the zero state eventually result in both spins reaching the logical zero state. Most importantly, the scientists showed that the cost of erasing the qubit's memory is given in terms of the quantity defining the logic states, which in this case is spin angular momentum and not energy. The scientists explained that experimentally realizing this scheme would be very difficult. Nevertheless, their results show that physical laws do not forbid information erasure with a zero energy cost, which is contrary to previous studies. The researchers noted that, in practice, it will be especially difficult to ensure the system's energy degeneracy (that different spin states of the qubit and reservoir have the exact same energy level). But even if imperfect conditions cause some energy loss, there is no fundamental reason to assume that the cost will be as large as that predicted by Landauer's formula. The possibility of erasing information without using energy has implications for a variety of areas. One example is the paradox of Maxwell's demon, which appears to offer a way of violating the second law of thermodynamics. By opening and closing a door to separate hot and cold molecules, the demon supposedly extracts work from the reservoir, converting all heat into useful mechanical energy. Bennett's resolution of the paradox in 1982 argues that the demon's memory has to be erased to complete the cycle, and the cost of erasure is at least as much as the liberated energy. However, Vaccaro and Barnett's results suggest that the demon's memory can be erased at no energy cost by using a different kind of reservoir, where the cost would be in terms of spin angular momentum.
In this scheme, the demon can extract all the energy from a heat reservoir as useful energy at a cost of another resource. As the scientists explained, this result doesn't contradict historical statements of the second law of thermodynamics, which are exclusively within the context of heat and thermal reservoirs and do not allow for a broader class of reservoirs. Moreover, even though the example with Maxwell's demon suggests that mechanical work can be extracted at zero energy cost, this extraction is associated with an increase in the information-theoretic entropy of the overall system. "The maximization of entropy subject to a constraint need not apply only to heat reservoirs and the conservation of energy," Vaccaro explained to PhysOrg.com. The results could also apply to hypothetical Carnot heat engines, which operate at maximum efficiency. If these engines use angular momentum reservoirs instead of thermal reservoirs, they could generate angular momentum effort instead of mechanical work. As for demonstrating the concept of erasing information at zero energy cost, the scientists said that it would take more research and time. "We are currently looking at an idea to perform information erasure in atomic and optical systems, but it needs much more development to see if it would actually work in practice," Vaccaro said. She added that the result is of fundamental significance, and it's not likely to have practical applications for memory devices. "We don't see this as having a direct impact in terms of practical applications, because the current energy cost of information erasure is nowhere near Landauer's theoretical bound," she said. "It's more a case of what it says about fundamental concepts. For example, Landauer said that information is physical because it takes energy to erase it. We are saying that the reason it is physical has a broader context than that." More information: Joan A. Vaccaro and Stephen M. Barnett. Information erasure without an energy cost. Proceedings of the Royal Society A. DOI:10.1098/rspa.2010.0577
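For a sense of the energy scale being discussed, Landauer's bound is k_B T ln 2 per erased bit. The short calculation below (a generic sketch using textbook constants, not taken from the paper) shows that at room temperature this is only a few zeptojoules, far below what today's memory hardware actually dissipates per bit.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # K, roughly room temperature

landauer_joules = k_B * T * math.log(2)
landauer_eV = landauer_joules / 1.602176634e-19

print(f"Landauer bound at 300 K: {landauer_joules:.2e} J per bit "
      f"({landauer_eV * 1000:.0f} meV)")
# ~2.9e-21 J, i.e. about 18 meV per erased bit; real memory devices
# currently dissipate vastly more than this.
```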
<urn:uuid:9d3de6cd-28c3-4601-a285-3f914a03e2d8>
CC-MAIN-2014-15
http://phys.org/news/2011-01-scientists-erase-energy.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609533308.11/warc/CC-MAIN-20140416005213-00571-ip-10-147-4-33.ec2.internal.warc.gz
en
0.938162
1,096
3.546875
4
Entangled photons are the tools of choice for testing the foundations of quantum physics and demonstrating the teleportation of quantum states. They are also a resource in quantum information protocols. So far, most work has focused on entangled photons at optical frequencies. But for several decades, researchers have been developing a quantum information technology based on superconducting circuits, which have quantized excitations in the form of microwave photons. These circuits are attractive for generating, processing, and storing quantum information because they can be lithographically patterned on a small chip, allowing a good design to be replicated many times over. Now, in Physical Review Letters, Emmanuel Flurin and colleagues at École Normale Supérieure in Paris report they can generate, and then spatially separate, entangled microwave fields in a superconducting circuit , enabling quantum teleportation schemes that utilize microwave technology. Entangled objects (such as two photons) are described by a common, nonseparable quantum-mechanical state. Measuring one object instantly changes the common state and therefore instantly affects the other object, even if it is far away—a phenomenon that seems to fly in the face of special relativity. As a concept, entanglement got off to a rocky start. In 1935, Einstein, Podolsky, and Rosen (EPR) concluded that the “action at a distance” allowed by entanglement was so bizarre that quantum mechanics must be an incomplete theory . They and others proposed the alternate theory of local hidden variables, which (loosely speaking) assumes entangled objects decide on a set of variables before they separate. A clear resolution didn’t emerge until 1964, when John Bell showed that the hidden variable theory set an upper bound on the degree of correlation between particles in an entangled state . In contrast, quantum mechanics predicted that correlations for certain states exceeded, or “violated”, this upper bound. Violations of Bell’s inequality emerged again and again in quantum optics experiments, confirming that quantum entanglement was real. Still, the quest to find a loophole has continued to this day. Even though it started as a suspicious property, by the nineties, entanglement was considered a technologically important resource for quantum cryptography and quantum computation. Other applications grew out of the fact that entangled photons can behave like an effective wave with half the photons’ wavelength, a feature that can be used to perform interferometry at sub-shot noise levels. One advantage of using microwave superconducting circuits to test quantum physics and explore its applications is that the interaction between microwave photons is large compared to that found in optical photon technologies. These lithographically defined circuits are made from metals, like aluminum, which become superconducting when cooled to low temperatures. A central component within the circuit is the Josephson junction, a thin barrier that separates two superconducting stretches of metal. On either side of a Josephson junction, the Cooper-pair condensate is described by a macroscopic quantum mechanical wave function with an amplitude and a phase. The current through the junction is a sinusoidal function of the phase difference across the junction, which is in turn proportional to the time-integral of the voltage across the junction. 
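The two statements just made, a sinusoidal current-phase relation and a phase whose rate of change tracks the voltage, are the standard Josephson relations. Written out (textbook results, not specific to the Flurin et al. device):

```latex
I = I_c \sin\varphi , \qquad
\frac{d\varphi}{dt} = \frac{2e}{\hbar} V
\quad\Rightarrow\quad
\frac{dI}{dt} = \frac{2e I_c \cos\varphi}{\hbar}\, V
```

A junction therefore acts like an inductor whose effective inductance $L_J(\varphi) = \hbar / (2e I_c \cos\varphi)$ depends on the phase itself; that built-in nonlinearity is what the circuits described next exploit for amplification and frequency conversion.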
As a result, unlike capacitors and inductors, Josephson junctions have a nonlinear response to currents, which makes it possible to engineer circuits that amplify or change the frequency of signals. Even though they are macroscopic objects, superconducting circuits support quantized excitations of the electromagnetic field. Researchers at NEC in Japan have used superconducting circuits to make a quantum bit that could remain in a superposition of two charge states for a few nanoseconds. In the last few years, experimentalists at the University of California, Santa Barbara, have demonstrated a nine-element solid-state quantum processor that can factor the number 15, while researchers at IBM have observed coherence times of a single superconducting qubit approaching 100 microseconds. Building on this superconducting technology for quantum information processing, Flurin et al. take a first step towards engineering the circuits for quantum communication. The circuit they use for generating the entanglement is essentially a parametric amplifier, an ultralow-noise device that amplifies a quantum signal. In this case, they use it to amplify vacuum fluctuations (fluctuations that exist because the circuit is a quantum object) to create a so-called squeezed state. To do this, the team fabricated a chip (Fig. 1) that consists of two thin, serpentine aluminum channels that act as microwave resonators with different resonant frequencies (ω1 and ω2). These resonators are coupled by a nonlinear circuit element consisting of several Josephson junctions. A third aluminum channel pumps coherent microwaves with frequency ω1 + ω2 into the nonlinear crossing, which converts a pumped photon into a pair of photons, one in each resonator, whose frequencies ω1 and ω2 add up to the pump frequency. Since both photons originate from the same pump photon, they are correlated in a specific way, called two-mode squeezing, with the phase reference coming from the pump. The entanglement is then detected using a second device, similar to the one used to entangle the photons. With their device, Flurin et al. are able to produce pairs of entangled photons at a rate corresponding to six million entangled bits per second. Two-mode squeezing of microwaves has been observed before, but it was at two different frequencies in a single transmission line. Flurin et al. are the first to create a two-mode squeezed microwave field where the two modes are spatially separated, thus showing entanglement in the original EPR sense. Taking a different approach, researchers in Germany, Spain, and Japan are now reporting on the arXiv that they can produce spatially separated entangled microwave photons at a single frequency. By demonstrating an efficient way to produce a flow of spatially separated entangled microwave photons, Flurin et al. have opened the door towards a new set of on-chip experiments in quantum information and measurement. One of the next steps will certainly be to demonstrate quantum teleportation with microwaves. Long term, researchers will look to interface microwave circuits, which are efficient at generating strongly interacting photons, with the fiber optic technology that works so well for sending light over long distances. - E. Flurin, N. Roch, F. Mallet, M. H. Devoret, and B. Huard, "Generating Entangled Microwave Radiation Over Two Transmission Lines," Phys. Rev. Lett. 109, 183901 (2012). - A. Einstein, B. Podolsky, and N. Rosen, "Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?," Phys. Rev. 47, 777 (1935). 
- J. Bell, “On the Einstein Podolsky Rosen Paradox,” Physics 1, 195 (1964). - M. Devoret et al., “Measurements of Macroscopic Quantum Tunneling out of the Zero-Voltage State of a Current-Biased Josephson Junction,” Phys. Rev. Lett. 55, 1908 (1985). - Y. Nakamura, Yu. A. Pashkin, and J. S. Tsai, “Coherent Control of Macroscopic Quantum States in a Single-Cooper-Pair Box,” Nature 398, 786 (1999). - E. Lucero et al., “Computing prime factors with a Josephson phase qubit quantum processor,” Nature Phys. 8, 719 (2012). - C. Rigetti et al., “Superconducting Qubit in a Waveguide Cavity with a Coherence Time Approaching 0.1 ms,” Phys. Rev. B 86, 100506 (2012). - C. M. Caves and B. L. Schumaker “New Formalism for Two-Photon Quantum Optics. I. Quadrature Phases and Squeezed States,” Phys. Rev. A 31, 3068 (1985). - C. Eichler, D. Bozyigit, C. Lang, M. Baur, L. Steffen, J. M. Fink, S. Filipp, and A. Wallraff, “Observation of Two-Mode Squeezing in the Microwave Frequency Domain,” Phys. Rev. Lett. 107, 113601 (2011). - E. P. Menzel et al., “Path Entanglement of Continuous-Variable Quantum Microwaves,” arXiv:1210.4413 (cond-mat.mes-hall).
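To make the two-mode squeezing idea concrete, here is a small illustrative calculation. It is not taken from Flurin et al.'s paper: it simply writes down the textbook covariance matrix of an ideal two-mode squeezed vacuum, using an arbitrary squeezing parameter r and the convention that each vacuum quadrature has variance 1/2, and checks the Duan-Simon criterion that witnesses EPR-type entanglement between the two modes.

```python
# Illustrative check, not from the paper: EPR correlations of an ideal
# two-mode squeezed vacuum, the state a parametric amplifier produces when it
# amplifies vacuum fluctuations. Convention: hbar = 1, vacuum variance 1/2 per
# quadrature; the squeezing parameter r is an arbitrary illustrative value.
import numpy as np

r = 1.0                                   # hypothetical squeezing strength
c, s = np.cosh(2 * r), np.sinh(2 * r)

# Covariance matrix of the quadratures (x1, p1, x2, p2).
V = 0.5 * np.array([[c, 0, s, 0],
                    [0, c, 0, -s],
                    [s, 0, c, 0],
                    [0, -s, 0, c]])

# The joint quadratures x1 - x2 and p1 + p2 are squeezed below vacuum noise,
# even though each mode on its own just looks like amplified thermal noise.
d_minus = np.array([1, 0, -1, 0])
d_plus = np.array([0, 1, 0, 1])
var_minus = d_minus @ V @ d_minus
var_plus = d_plus @ V @ d_plus

# Duan-Simon witness: any separable state has Var(x1-x2) + Var(p1+p2) >= 2
# in this convention, so a smaller sum certifies entanglement.
total = var_minus + var_plus
print(f"Var(x1-x2) = {var_minus:.3f}, Var(p1+p2) = {var_plus:.3f}, sum = {total:.3f}")
print("entangled" if total < 2 else "no entanglement witnessed")
```

For any r greater than zero the sum falls below the separability bound of 2, which is the sense in which the two spatially separated microwave modes are EPR-correlated.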
<urn:uuid:a795fc97-3bd4-4bb0-91c6-fa29fa85a38a>
CC-MAIN-2014-15
http://physics.aps.org/articles/v5/120
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609530895.48/warc/CC-MAIN-20140416005210-00556-ip-10-147-4-33.ec2.internal.warc.gz
en
0.894909
1,808
4.0625
4
First Generation (1941-1956) World War II gave rise to numerous developments and started off the computer age. The Electronic Numerical Integrator and Computer (ENIAC) was produced by a partnership between the University of Pennsylvania and the US government. It consisted of 18,000 vacuum tubes and 7000 resistors. It was developed by John Presper Eckert and John W. Mauchly and was a general-purpose computer. "Von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC) in 1945 with a memory to hold both a stored program as well as data." Von Neumann's computer allowed for all the computer functions to be controlled by a single source. Then in 1951 came the Universal Automatic Computer (UNIVAC I), designed by Remington Rand and collectively owned by the US Census Bureau and General Electric. UNIVAC amazingly predicted the winner of the 1952 presidential election, Dwight D. Eisenhower. In first generation computers, the operating instructions or programs were specifically built for the task for which the computer was manufactured. Machine language was the only way to tell these machines to perform operations. These computers were difficult to program, and even more difficult to deal with when something malfunctioned. First generation computers used vacuum tubes and magnetic drums (for data storage). Pictured: the IBM 650 Magnetic Drum Calculator. Second Generation Computers (1956-1963) The invention of the transistor marked the start of the second generation. Transistors took the place of the vacuum tubes used in first generation computers. The first large-scale machines using this technology were made to meet the requirements of atomic energy laboratories. One of the other benefits to the programming group was that the second generation replaced machine language with assembly language. Though complex in itself, assembly language was much easier than binary code. Second generation computers also started showing the characteristics of modern-day computers, with utilities such as printers, disk storage and operating systems. Much financial information was processed using these computers. In second generation computers, the instructions (programs) could be stored inside the computer's memory. High-level languages such as COBOL (Common Business-Oriented Language) and FORTRAN (Formula Translator) were used, and they are still used for some applications nowadays. Pictured: the IBM 7090 Console in the Columbia Computer Center machine room, 1966, with a group of particle physicists who discovered the violation of charge-conjugation invariance in interactions of intermediate strength: Charles Baltay and Lawrence Kirsch of Nevis Lab (back row); Juliet Lee-Franzini of SUNY Stony Brook and team leader Paulo Franzini of Nevis Lab [V1#7]. Photo: Columbia Computer Center Newsletter, V1#7, Aug 1966, Columbiana Archive. Third Generation (1964-1971) Although transistors were a great improvement over vacuum tubes, they generated heat, which damaged the sensitive areas of the computer. The integrated circuit (IC) was invented in 1958 by Jack Kilby. It combined electronic components onto a small silicon disc, made from quartz. Further advances made it possible to fit even more components onto a small chip, or semiconductor. Third generation computers also had operating systems that allowed the machines to run many different applications. These applications were monitored and coordinated by the computer's memory. Pictured: the IBM 360/91. Fourth Generation (1971-Present) Fourth generation computers are the modern-day computers. 
Size started to go down with improvements in integrated circuits. Very-large-scale integration (VLSI) and ultra-large-scale integration (ULSI) ensured that millions of components could be fit onto a small chip. This reduced the size and price of computers while at the same time increasing their power, efficiency and reliability. "The Intel 4004 chip, developed in 1971, took the integrated circuit one step further by locating all the components of a computer (central processing unit, memory, and input and output controls) on a minuscule chip." The reduction in cost and the availability of computing power in a small package allowed everyday users to benefit. First came minicomputers, which offered users different applications, the most famous of these being word processors and spreadsheets that could be used by non-technical users. Video game systems like the Atari 2600 generated the general populace's interest in computers. In 1981, IBM introduced personal computers for home and office use. "The number of personal computers in use more than doubled from 2 million in 1981 to 5.5 million in 1982. Ten years later, 65 million PCs were being used." Computer size kept shrinking over the years, going from desktops to laptops to palmtops. The Macintosh introduced the graphical user interface, in which users do not have to type instructions but can use a mouse instead. Continued improvement allowed the networking of computers for the sharing of data. Local area networks (LANs) and wide area networks (WANs) were potential benefits, in that they could be implemented in corporations so that everybody could share data over them. Soon the internet and the World Wide Web appeared on the computer scene and fomented the high-tech revolution of the 1990s. Fifth generation computers Fifth generation computers are mainly future computers. Of course, some modern computers also belong to this generation. The aim of these computers is to develop devices that respond to natural language input and are capable of learning and self-organization. In these computers, massive numbers of CPUs are used for more efficient performance. Voice recognition is a special feature of these computers. By using superconductors and parallel processing, computer geeks are trying to make artificial intelligence a reality. Quantum computing, molecular technology and nanotechnology will change the face of computers in the coming years.
<urn:uuid:05ecc68a-b107-4416-8de1-314595e1080f>
CC-MAIN-2014-15
http://abdullateefoyedeji.blogspot.com/2009/01/1st-2nd-3rd-4th-generation-computers.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609532480.36/warc/CC-MAIN-20140416005212-00566-ip-10-147-4-33.ec2.internal.warc.gz
en
0.946036
1,165
3.625
4
Today's quantum computers are no more than experiments. Researchers can string together a handful of quantum bits - seemingly magical bits that store a "1" and "0" at the same time - and these ephemeral creations can run relatively simple algorithms. But new research from IBM indicates that far more complex quantum computers aren't that far away. On Tuesday, IBM revealed that physicists at its Watson Research Center in Yorktown Heights, New York have made significant advances in the creation of "superconducting qubits," one of several research fields that could eventually lead to a quantum computer that's exponentially more powerful than today's classical computers. According to Matthias Steffen - who oversees Big Blue's experimental quantum computing group - he and his team have improved the performance of superconducting qubits by a factor of two to four. "What this means is that we can really start thinking about much larger systems," he tells Wired, "putting several of these quantum bits together and performing much larger error correction." David DiVincenzo - a professor at the Jülich Research Center's Institute of Quantum Information in western Germany and a former colleague of Steffen - agrees that IBM's new research is more than just a milestone. "These metrics have now - for the first time - attained the levels necessary to begin scaling up quantum computation to greater complexity," he says. "I think that we will soon see whole quantum computing modules, rather than just two- or three-qubit experiments." Whereas the computer on your desk obeys the laws of classical physics - the physics of the everyday world - a quantum computer taps the mind-bending properties of quantum mechanics. In a classical computer, a transistor stores a single "bit" of information. If the transistor is "on," for instance, it holds a "1." If it's "off," it holds a "0." But with a quantum computer, information is represented by a system that can exist in two states at the same time, thanks to the superposition principle of quantum mechanics. Such a qubit can store a "0" and "1" simultaneously. Information might be stored in the spin of an electron, for instance. An "up" spin represents a "1." A "down" spin represents a "0." And at any given time, this spin can be both up and down. "The concept has almost no analog in the classical world," Steffen says. "It would be almost like me saying I could be over here and over there where you are at the same time." If you then put two qubits together, they can hold four values at once: 00, 01, 10, and 11. And as you add more and more qubits, you can build a system that's exponentially more powerful than a classical computer. You could, say, crack the world's strongest encryption algorithms in a matter of seconds. As IBM points out, a 250-qubit quantum computer would contain more bits than there are particles in the universe. But building a quantum computer isn't easy. The idea was first proposed in the mid-80s, and we're still at the experimental stage. The trouble is that quantum systems so easily "decohere," dropping from two simultaneous states into just a single state. Your quantum bit can very quickly become an ordinary classical bit. Researchers such as Matthias Steffen and David DiVincenzo aim to build systems that can solve this decoherence problem. At IBM, Steffen and his team base their research on a phenomenon known as superconductivity. In essence, if you cool certain substances to very low temperatures, they exhibit zero electrical resistance. 
Steffen describes this as something akin to a loop where current flows in two directions at the same time. A clockwise current represents a "1," and a counterclockwise current represents a "0." IBM's qubits are built atop a silicon substrate using aluminum and niobium superconductors. Essentially, a thin insulating barrier of aluminum oxide - the Josephson junction - sits between two superconducting electrodes. The trick is to keep this quantum system from decohering for as long as possible. If you can keep the qubits in a quantum state for long enough, Steffen says, you can build the error correction schemes you need to operate a reliable quantum computer. The threshold is about 10 to 100 microseconds, and according to Steffen, his team has now reached this point with a "three-dimensional" qubit based on a method originally introduced by researchers at Yale University. Ten years ago, decoherence times were closer to a nanosecond. In other words, over the last ten years, researchers have improved the performance of superconducting qubits by a factor of more than 10,000. IBM's team has also built a "controlled NOT gate" with traditional two-dimensional qubits, meaning they can flip the state of one qubit depending on the state of the other. This too is essential to building a practical quantum computer, and Steffen says his team can successfully flip that state 95 percent of the time - thanks to a decoherence time of about 10 microseconds. "So, not just is our single device performance remarkably good," he explains, "our demonstration of a two-qubit device - an elementary logic gate - is also good enough to get at least close to the threshold needed for a practical quantum computer. We're not quite there yet, but we're getting there." The result is that the researchers are now ready to build a system that spans several qubits. "The next bottleneck is not how to make these devices better. The bottleneck is how to put five or ten of these on a chip," Steffen says. "The device performance is good enough to do that right now. The question is just: ‘How do you put it all together?'"
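As a rough sketch of the two ingredients discussed here - the controlled NOT gate and the role of coherence time - the following toy calculation applies the standard CNOT matrix to the four two-qubit basis states and estimates the error per gate as the ratio of gate time to coherence time. This is not IBM's code; the 100-nanosecond gate time is an assumed round number chosen purely for illustration.

```python
# Toy sketch, not IBM's code: the controlled NOT (CNOT) gate as a matrix, plus
# a crude error-per-gate estimate. The 100 ns two-qubit gate time is an assumed
# round number for illustration; the coherence times echo the article.
import numpy as np

# Two-qubit basis ordering |control target>: |00>, |01>, |10>, |11>.
# Columns index the input state, rows the output state.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

labels = ["|00>", "|01>", "|10>", "|11>"]
for i, label in enumerate(labels):
    state = np.zeros(4)
    state[i] = 1.0                             # prepare a basis state
    out_index = int(np.argmax(CNOT @ state))
    print(f"{label} -> {labels[out_index]}")   # target flips only when control is 1

# Rough scaling: error per gate ~ gate time / coherence time.
gate_time = 100e-9                      # assumed 100 ns gate
for t_coh in (10e-6, 100e-6):           # ~10 us (reported) and ~100 us (upper threshold)
    print(f"coherence {t_coh * 1e6:.0f} us -> error per gate ~ {gate_time / t_coh:.0e}")
```

The point of the second loop is only the scaling: the longer a qubit stays coherent relative to the time a gate takes, the smaller the error that error-correction schemes have to clean up.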
<urn:uuid:d5cf484e-32de-4b35-bac8-fca387f5c0c1>
CC-MAIN-2014-15
http://gizmodo.com/5888878/ibm-busts-record-for-superconducting-quantum-computer?tag=guts
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609521512.15/warc/CC-MAIN-20140416005201-00190-ip-10-147-4-33.ec2.internal.warc.gz
en
0.957715
1,226
3.625
4
Quantum mechanics is, mathematically, quite simple. But it has implications that require people to think differently about the world. One particularly hard-to-grasp idea is that, on the surface, some knowledge precludes obtaining other knowledge. This is a consequence of how we obtain it. In an innovative experiment, researchers from Austria have demonstrated how to recover that lost information. Before you get the wrong impression, though, this is completely in agreement with the rules of quantum mechanics—it is simply a very clever way of playing with quantum states. Before looking at the experiment, note what makes this interesting. The keywords that turn up in these sorts of articles are superposition states and measurement. Imagine that we have 100 electrons, sitting in a magnetic field. Their individual magnetic fields are all, thanks to the applied field, pointing in the same direction. Now, we turn on a microwave for a specific period of time. Chosen correctly, all 100 electrons flip their fields so that they point in exactly the opposite direction. If we make a measurement, all electrons report the same spin. If we cut the time of the microwave pulse in half, however, something very strange happens: all the electrons end up with their fields pointing in both directions at once. This is called a superposition state. Once we make a measurement, though, we find half the electrons have their fields pointing in one direction and half have their fields pointing in the opposite direction—the superposition state vanishes. You might immediately think it was never there in the first place: we simply put in half the energy, so only half the electrons responded. But this is incorrect. We know that we get superposition states because we've looked. We can create a situation where, if the electrons were not in superposition state, we observe one result, and if they're in a superposition state, we observe something different. Even though we know it is a superposition state, every measurement on a single electron reports that its field is either pointing with the applied field or against the applied field. This behavior tells us that, when we make a measurement, we destroy the superposition state and place the electron in a single pure state. We can't tell anything about the superposition state other than that it included the state that we measured. Why should we care? This property of quantum mechanics has made quantum computing a little bit more difficult. If everything goes well, at the end of a calculation, a qubit (quantum bit) will be in a superposition of the right answer and the wrong answer. Although the probability of the right answer should be much higher than that of the wrong answer, there is always a chance of getting the wrong answer. In very fast quantum computers, we would just run the calculation a few times and take the most frequent answer as the correct answer. But what if your computer is slow? The ideal situation is to be able to take a measurement, then reconstruct the original superposition state so that repeated measurements could be used (ensuring that the most probable state can be determined). However, for a single particle, that is impossible. To overcome this problem, researchers have done the obvious: they spread their qubit over three particles. Even with three particles, however, measuring all three then taking a majority vote on the answer may not be good enough. So the researchers decided to be clever. 
In their scheme, the qubit is encoded in two states of an ionized calcium atom. To provide redundancy, the qubit is then entangled with two other qubits. (You can think of this as creating two mirror images of the quantum state of the qubit in two other particles. This isn't a technically correct description, but it should help you get the point.) Now, we have our three qubits, each of which encodes a single quantum bit of information. But, unlike our electron in the magnetic field where there are only two possible states, the calcium ion has many, many states available to it. The researchers make use of a total of four states. One state corresponds to a logical zero, while a second corresponds to a logical one. (I'll call the other two states the measurement state and the hidden state.) One laser connects the measurement state to the logical one state, while a second laser connects the logical one state to the hidden state. To measure a state, we turn on the first laser. The qubit falls into either the logical one or logical zero, based on the probabilities of the superposition state. If it ends up in logical one, the laser light is scattered by the ion and is detected. Hence a pulse on the photodetector indicates a logical one, while the absence of light signals a logical zero. That is the measurement process. The second laser simply changes the definition of the qubit. Initially, a qubit is a superposition of the logical one and logical zero states. After the laser pulse, the qubit is a superposition between the hidden state and the logical zero state. In this case, the qubit cannot be evaluated by the measurement process. The researchers take advantage of this by placing two of the three qubits into the hidden state, then measuring the remaining qubit to get an answer (logical one or zero). Then, after the measurement process, the hidden qubits are returned to their original states (a superposition of logical one and zero) and re-entangled with the original qubit. In doing so, the qubit is placed back in its original superposition state. By repeating this process, the researchers can measure the state as many times as required to ensure that they know which logical state was the most frequent. Indeed, through multiple measurements, the researchers can obtain the relative probabilities of logical one and logical zero of the original superposition state. In terms of advances for practical quantum computers, this may not mean a huge amount. However, it demonstrates a technique that will be critical for a working quantum computer. So in this sense, it is a very important step. But the issue may be with the implementation; the work was done with trapped ions, and it's hard to believe that trapped ions, floating in a vacuum, are the future of quantum computing. Physical Review Letters, 2013, DOI: 10.1103/PhysRevLett.110.070403
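The payoff of being able to re-prepare and re-measure the same superposition can be illustrated with a toy simulation. This is a generic sketch, not the experiment's analysis code, and the underlying probability of 0.7 is an arbitrary choice: each shot returns logical one with that true probability, and both the majority vote and the estimate of the probability sharpen as the number of repeated measurements grows.

```python
# Generic toy simulation, not the experiment's analysis: repeatedly measuring a
# qubit that can be restored to the same superposition. The true probability of
# reading "logical one" (0.7 here) is an arbitrary choice for illustration.
import random

random.seed(1)
p_one = 0.7                              # assumed underlying probability

for n_measurements in (1, 7, 51, 501):
    ones = sum(random.random() < p_one for _ in range(n_measurements))
    estimate = ones / n_measurements
    majority = 1 if 2 * ones > n_measurements else 0
    print(f"{n_measurements:4d} shots: estimated p(one) = {estimate:.2f}, "
          f"majority vote = {majority}")
```

With a single shot the answer is essentially a coin flip weighted by the superposition; after hundreds of restored measurements, both the most probable logical state and its relative probability are pinned down.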
<urn:uuid:d1d43d81-1bd4-4e41-b3dd-36e0bfd03057>
CC-MAIN-2014-15
http://arstechnica.com/science/2013/03/quantum-computer-gets-an-undo-button/?comments=1
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609535535.6/warc/CC-MAIN-20140416005215-00255-ip-10-147-4-33.ec2.internal.warc.gz
en
0.943375
1,306
3.890625
4
Climate events drive a high-arctic vertebrate community into synchrony Climate change is known to affect the population dynamics of single species, such as reindeer or caribou, but the effect of climate at the community level has been much more difficult to document. Now, a group of Norwegian scientists has found that extreme climate events cause synchronized population fluctuations among all vertebrate species in a relatively simple high arctic community. These findings may be a bellwether of the radical changes in ecosystem stability that could result from anticipated future increases in extreme events. The findings are published in the 18 January issue of Science. The Norwegian scientists, with lead authors from the Centre for Conservation Biology at the Norwegian University of Science and Technology (NTNU), wanted to know how climate and weather events influenced an overwintering vertebrate community on the high arctic island of Spitsbergen, Svalbard, at 78 degrees N latitude. They chose this simple ecosystem because it is composed of just three herbivores in the winter -- the wild Svalbard reindeer (Rangifer tarandus platyrhynchus), the Svalbard rock ptarmigan (Lagopus muta hyperborea), and the sibling vole (Microtus levis), and one shared consumer, the arctic fox (Vulpes lagopus). The community's population fluctuations were mainly driven by rain-on-snow events, the researchers found. Rain-on-snow is an extreme climatic occurrence that causes icing on the deep-frozen arctic tundra. The ice keeps reindeer from grazing on their winter pastures and also reduces food accessibility for the rock ptarmigan and sibling vole populations, causing extensive simultaneous population crashes in all three species in the winter and spring after the extreme weather. However, the arctic fox, which mainly relies on reindeer carcasses as its terrestrial winter food source, didn't see a decline in its population size until a year after the herbivore die-offs. Even though the synchronized die-offs decrease the number of live prey available for foxes to eat, the high number of reindeer carcasses generates an abundance of food for foxes during icy winters and the subsequent spring and summer. This leads to high fox reproduction. But almost no reindeer carcasses will be available during the following winter, mainly because those reindeer that survived the previous winter are more robust and also subject to reduced competition for food resources. At the same time, none of the other herbivores is able to recover in the summer after the icing. The net result is low fox reproduction and a strong reduction in the arctic fox population size one year after the herbivore die-offs. "We have known for a long time that climate can synchronize populations of the same species, but these findings suggest that climate and particularly extreme weather events may also synchronize entire communities of species," says lead author Brage Bremset Hansen, from NTNU's Centre for Conservation Biology. "Svalbard's relatively simple ecosystem, which lacks specialist predators, combined with large weather fluctuations from year to year and strong climate signals in the population dynamics of herbivores, are the likely explanations for how such clear climate effects can be observed at the ecosystem level." In other, more complex systems, he says, community-level effects of climate can be present but are likely masked by other factors that tend to obscure the synchronizing effects of climate, which thus complicates the picture. 
Extreme rain-on-snow events are rare in most of the Arctic compared with Svalbard, where the climate is oceanic and mild for the latitude. However, because the frequency of such rain-on-snow events leading to icing is closely linked to a rapidly warming arctic climate, the authors warn that changes in winter climate and extreme events may have important implications for ecosystem functioning and stability in the circumpolar Arctic in the future. "Previous studies have shown that rain-on-snow and icing can also cause vegetation damage and reduce survival of soil microbiota," says Hansen. "But more importantly, we suspect that the strong effects of icing on the overwintering vertebrate community have the potential to indirectly influence other species and cascade throughout the food web. The die-offs among resident herbivores shape predator abundance, which could in turn affect the migratory prey that reside in the area in the summer, such as sea birds and barnacle geese."
<urn:uuid:a7ce87b0-c666-40c7-8482-355162c4a331>
CC-MAIN-2014-15
http://esciencenews.com/articles/2013/01/17/climate.events.drive.a.high.arctic.vertebrate.community.synchrony
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223206647.11/warc/CC-MAIN-20140423032006-00362-ip-10-147-4-33.ec2.internal.warc.gz
en
0.916215
1,187
3.71875
4
Diamonds have long been available in pairs—say, mounted in a nice set of earrings. But physicists have now taken that pairing to a new level, linking two diamonds on the quantum level. A group of researchers report in the December 2 issue of Science that they managed to entangle the quantum states of two diamonds separated by 15 centimeters. Quantum entanglement is a phenomenon by which two or more objects share an unseen link bridging the space between them—a hypothetical pair of entangled dice, for instance, would always land on matching numbers, even if they were rolled in different places simultaneously. But that link is fragile, and it can be disrupted by any number of outside influences. For that reason entanglement experiments on physical systems usually take place in highly controlled laboratory setups—entangling, say, a pair of isolated atoms cooled to nearly absolute zero. In the new study, researchers from the University of Oxford, the National Research Council of Canada and the National University of Singapore (NUS) showed that entanglement can also be achieved in macroscopic objects at room temperature. "What we have done is demonstrate that it's possible with more standard, everyday objects—if diamond can be considered an everyday object," says study co-author Ian Walmsley, an experimental physicist at Oxford. "It's possible to put them into these quantum states that you often associate with these engineered objects, if you like—these closely managed objects." To entangle relatively large objects, Walmsley and his colleagues harnessed a collective property of diamonds: the vibrational state of their crystal lattices. By targeting a diamond with an optical pulse, the researchers can induce a vibration in the diamond, creating an excitation called a phonon—a quantum of vibrational energy. Researchers can tell when a diamond contains a phonon by checking the light of the pulse as it exits. Because the pulse has deposited a tiny bit of its energy in the crystal, one of the outbound photons is of lower energy, and hence longer wavelength, than the photons of the incoming pulse. Walmsley and his colleagues set up an experiment that would attempt to entangle two different diamonds using phonons. They used two squares of synthetically produced diamond, each three millimeters across. A laser pulse, bisected by a beam splitter, passes through the diamonds; any photons that scatter off of the diamond to generate a phonon are funneled into a photon detector. One such photon reaching the detector signals the presence of a phonon in the diamonds. But because of the experimental design, there is no way of knowing which diamond is vibrating. "We know that somewhere in that apparatus, there is one phonon," Walmsley says. "But we cannot tell, even in principle, whether that came from the left-hand diamond or the right-hand diamond." In quantum-mechanical terms, in fact, the phonon is not confined to either diamond. Instead the two diamonds enter an entangled state in which they share one phonon between them. To verify the presence of entanglement, the researchers carried out a test to check that the diamonds were not acting independently. In the absence of entanglement, after all, half the laser pulses could set the left-hand diamond vibrating and the other half could act on the right-hand diamond, with no quantum correlation between the two objects. If that were the case, then the phonon would be fully confined to one diamond. 
If, on the other hand, the phonon were indeed shared by the two entangled diamonds, then any detectable effect of the phonon could bear the imprint of both objects. So the researchers fired a second optical pulse into the diamonds, with the intent of de-exciting the vibration and producing a signal photon that indicates that the phonon has been removed from the system. The phonon's vibrational energy gives the optical pulse a boost, producing a photon with higher energy, or shorter wavelength, than the incoming photons and eliminating the phonon in the process. Once again, there is no way of knowing which diamond produced the photon, because the paths leading from each diamond to the detectors are merged, so there is no way of knowing where the phonon was. But the researchers found that each of the photon paths leading from the diamonds to the detectors had an interfering effect on the other—adjusting how the two paths were joined affected the photon counts in the detectors. In essence, a single photon reaching the detectors carried information about both paths. So it cannot be said to have traveled down one path from one diamond: the photon, as with the vibrational phonon that produced it, came from both diamonds. After running the experiment over and over again to gather statistically significant results, the researchers concluded with confidence that entanglement had indeed been achieved. "We can't be 100 percent certain that they're entangled, but our statistical analysis shows that we're 98 percent confident in that, and we think that's a pretty good outcome," Walmsley says. The catch to using phonons for macroscopic entanglement is that they do not last long—only seven picoseconds, or seven trillionths of a second, in diamond. So the experimenters had to rely on extremely fast optical pulses to carry out their experiment, creating entangled states with phonons and then damping the phonons with the second pulse to test that entanglement just 0.35 picoseconds later. Because of this brevity, such entanglement schemes may not take over for more established techniques using photons or single atoms, but Walmsley hopes that researchers will consider the possibilities of using fairly ordinary, room-temperature materials in quantum technologies. "I think it gives a new scenario and a new instantiation of something that helps point in that direction," he says. Indeed, the new study is just the latest to show how quantum mechanics applies in real-world, macroscopic systems. Oxford and NUS physicist Vlatko Vedral, who was not involved in the new research, says it "beautifully illustrates" the point of Austrian physicist Erwin Schrödinger's famous thought experiment in which a hypothetical cat is simultaneously alive and dead. "It can't be that entanglement exists at the micro level (say of photons) but not at the macro level (say of diamonds)," because those worlds interact, Vedral wrote in an email. "Schrödinger used atoms instead of photons and cats instead of diamonds, but the point is the same."
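The interference signature described above - photon counts that change as the two merged paths are adjusted - follows from a textbook single-photon calculation. The sketch below is a generic illustration, not a model of the Oxford apparatus: a photon amplitude split equally between two paths is recombined with a relative phase, and the detection probability oscillates with that phase, which is only possible if a single detected photon carries information about both paths.

```python
# Generic single-photon interference sketch, not a model of the actual setup:
# one photon, equal amplitudes on a "left" and a "right" path, recombined with
# a relative phase phi on a 50/50 beam splitter.
import numpy as np

for phi in np.linspace(0.0, 2 * np.pi, 9):
    amplitude = 0.5 * (1 + np.exp(1j * phi))   # sum of the two path amplitudes
    probability = abs(amplitude) ** 2          # equals (1 + cos(phi)) / 2
    print(f"phase = {phi:4.2f} rad -> detection probability = {probability:.3f}")
```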
<urn:uuid:f295d71d-ccf9-4ce8-af35-386cbee0edd6>
CC-MAIN-2014-15
http://www.scientificamerican.com/article/room-temperature-entanglement/
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223206647.11/warc/CC-MAIN-20140423032006-00362-ip-10-147-4-33.ec2.internal.warc.gz
en
0.95356
1,340
3.609375
4
Action at a distance In physics, action at a distance is the nonlocal interaction of objects that are separated in space. This term was used most often in the context of early theories of gravity and electromagnetism to describe how an object responds to the influence of distant objects. More generally "action at a distance" describes the failure of early atomistic and mechanistic theories which sought to reduce all physical interaction to collision. The exploration and resolution of this problematic phenomenon led to significant developments in physics, from the concept of a field, to descriptions of quantum entanglement and the mediator particles of the standard model. Electricity and magnetism Efforts to account for action at a distance in the theory of electromagnetism led to the development of the concept of a field which mediated interactions between currents and charges across empty space. According to field theory we account for the Coulomb (electrostatic) interaction between charged particles through the fact that charges produce around themselves an electric field, which can be felt by other charges as a force. The concept of the field was elevated to fundamental importance in Maxwell's equations, which used the field to elegantly account for all electromagnetic interactions, as well as light (which, until then, had been a completely unrelated phenomenon). In Maxwell's theory, the field is its own physical entity, carrying momenta and energy across space, and action at a distance is only the apparent effect of local interactions of charges with their surrounding field. Electrodynamics can be described without fields (in Minkowski space) as the direct interaction of particles with lightlike separation vectors. This results in the Fokker-Tetrode-Schwarzschild action integral. This kind of electrodynamic theory is often called "direct interaction" to distinguish it from field theories where action at a distance is mediated by a localized field (localized in the sense that its dynamics are determined by the nearby field parameters). This description of electrodynamics, in contrast with Maxwell's theory, explains apparent action at a distance not by postulating a mediating entity (the field) but by appealing to the natural geometry of special relativity. Direct interaction electrodynamics is explicitly symmetrical in time, and avoids the infinite energy predicted in the field immediately surrounding point particles. Feynman and Wheeler have shown that it can account for radiation and radiative damping (which had been considered strong evidence for the independent existence of the field). However various proofs, beginning with that of Dirac have shown that direct interaction theories (under reasonable assumptions) do not admit Lagrangian or Hamiltonian formulations (these are the so-called No Interaction Theorems). Also significant is the measurement and theoretical description of the Lamb shift which strongly suggests that charged particles interact with their own field. Because of these difficulties, and others, it is fields that have been elevated to the fundamental operators in QFT and modern physics has largely abandoned direct interaction theory. Newton's theory of gravity offered no prospect of identifying any mediator of gravitational interaction. His theory assumed that gravitation acts instantaneously, regardless of distance. Kepler's observations gave strong evidence that in planetary motion angular momentum is conserved. 
(The mathematical proof is only valid in the case of a Euclidean geometry.) Gravity is also known as a force of attraction between two objects because of their mass. From a Newtonian perspective, action at a distance can be regarded as: "a phenomenon in which a change in intrinsic properties of one system induces a change in the intrinsic properties of a distant system, independently of the influence of any other systems on the distant system, and without there being a process that carries this influence contiguously in space and time" (Berkovitz 2008). A related question, raised by Ernst Mach, was how rotating bodies know how much to bulge at the equator. This, it seems, requires an action-at-a-distance from distant matter, informing the rotating object about the state of the universe. Einstein coined the term Mach's principle for this question. It is inconceivable that inanimate Matter should, without the Mediation of something else, which is not material, operate upon, and affect other matter without mutual Contact…That Gravity should be innate, inherent and essential to Matter, so that one body may act upon another at a distance thro' a Vacuum, without the Mediation of any thing else, by and through which their Action and Force may be conveyed from one to another, is to me so great an Absurdity that I believe no Man who has in philosophical Matters a competent Faculty of thinking can ever fall into it. Gravity must be caused by an Agent acting constantly according to certain laws; but whether this Agent be material or immaterial, I have left to the Consideration of my readers.—Isaac Newton, Letters to Bentley, 1692/3 According to Albert Einstein's theory of special relativity, instantaneous action at a distance was seen to violate the relativistic upper limit on speed of propagation of information. If one of the interacting objects were to suddenly be displaced from its position, the other object would feel its influence instantaneously, meaning information had been transmitted faster than the speed of light. One of the conditions that a relativistic theory of gravitation must meet is to be mediated with a speed that does not exceed c, the speed of light in a vacuum. It could be seen from the previous success of electrodynamics that the relativistic theory of gravitation would have to use the concept of a field or something similar. This problem has been resolved by Einstein's theory of general relativity in which gravitational interaction is mediated by deformation of space-time geometry. Matter warps the geometry of space-time and these effects are, as with electric and magnetic fields, propagated at the speed of light. Thus, in the presence of matter, space-time becomes non-Euclidean, resolving the apparent conflict between Newton's proof of the conservation of angular momentum and Einstein's theory of special relativity. Mach's question regarding the bulging of rotating bodies is resolved because local space-time geometry is informing a rotating body about the rest of the universe. In Newton's theory of motion, space acts on objects, but is not acted upon. In Einstein's theory of motion, matter acts upon space-time geometry, deforming it, and space-time geometry acts upon matter. Since the early 20th century, quantum mechanics has posed new challenges for the view that physical processes should obey locality. 
Whether quantum entanglement counts as action-at-a-distance hinges on the nature of the wave function and decoherence, issues over which there is still considerable debate among scientists and philosophers. One important line of debate originated with Einstein, who challenged the idea that quantum mechanics offers a complete description of reality, along with Boris Podolsky and Nathan Rosen. They proposed a thought experiment involving an entangled pair of observables with non-commuting operators (e.g. position and momentum). This thought experiment, which came to be known as the EPR paradox, hinges on the principle of locality. A common presentation of the paradox is as follows: two particles interact and fly off in opposite directions. Even when the particles are so far apart that any classical interaction would be impossible (see principle of locality), a measurement of one particle nonetheless determines the corresponding result of a measurement of the other. After the EPR paper, several scientists such as de Broglie studied local hidden variables theories. In the 1960s John Bell derived an inequality that indicated a testable difference between the predictions of quantum mechanics and local hidden variables theories. To date, all experiments testing Bell-type inequalities in situations analogous to the EPR thought experiment have results consistent with the predictions of quantum mechanics, suggesting that local hidden variables theories can be ruled out. Whether or not this is interpreted as evidence for nonlocality depends on one's interpretation of quantum mechanics. Non-standard interpretations of quantum mechanics vary in their response to the EPR-type experiments. The Bohm interpretation gives an explanation based on nonlocal hidden variables for the correlations seen in entanglement. Many advocates of the many-worlds interpretation argue that it can explain these correlations in a way that does not require a violation of locality, by allowing measurements to have non-unique outcomes. - Quantum pseudo-telepathy - Quantum teleportation - Wheeler–Feynman absorber theory - Dynamism (metaphysics) - Hesse, Mary B. (December 1955). "Action at a Distance in Classical Physics". Retrieved 2012-11-04. - Barut, A. O. "Electrodynamics and Classical Theory of Fields and Particles" - Berkovitz, Joseph (2008). "Action at a Distance in Quantum Mechanics". In Edward N. Zalta. The Stanford Encyclopedia of Philosophy (Winter 2008 ed.). - Einstein, A.; Podolsky, B.; Rosen, N. (1935). "Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?". Physical Review 47 (10): 777–780. Bibcode:1935PhRv...47..777E. doi:10.1103/PhysRev.47.777. - Bell, J.S. (1966). On the problem of hidden variables in quantum mechanics. Reviews of Modern Physics. 38(3). 447-452. - Rubin (2001). "Locality in the Everett Interpretation of Heisenberg-Picture Quantum Mechanics". Found. Phys. Lett. 14 (4): 301–322. arXiv:quant-ph/0103079. doi:10.1023/A:1012357515678.
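The Bell-inequality discussion above can be made quantitative with a short, standard calculation; the snippet below is a textbook illustration, not tied to any particular experiment. For the singlet state, quantum mechanics predicts a correlation E(a, b) = -cos(a - b) between spin measurements along analyzer directions a and b, while local hidden-variable theories bound the CHSH combination of four such correlations by 2.

```python
# Standard textbook CHSH calculation, not tied to any particular experiment.
# Quantum prediction for the singlet state: E(a, b) = -cos(a - b).
import math

def E(a, b):
    """Correlation between spin measurements along analyzer angles a and b (radians)."""
    return -math.cos(a - b)

# Angle choices that maximize the quantum violation.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(f"|S| = {abs(S):.3f}; local hidden variables require |S| <= 2")
```

The result, |S| = 2√2 ≈ 2.83, is the Tsirelson value that Bell-test experiments consistent with quantum mechanics approach.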
<urn:uuid:a970e6c7-f6ea-4735-8454-6f5ab98781ba>
CC-MAIN-2014-15
http://en.wikipedia.org/wiki/Action_at_a_distance_(physics)
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223207046.13/warc/CC-MAIN-20140423032007-00370-ip-10-147-4-33.ec2.internal.warc.gz
en
0.922946
1,997
3.609375
4
The first neotropical rainforest was home of the Titanoboa Smithsonian researchers working in Colombia's Cerrejón coal mine have unearthed the first megafossil evidence of a neotropical rainforest. Titanoboa, the world's biggest snake, lived in this forest 58 million years ago at temperatures 3-5 C warmer than in rainforests today, indicating that rainforests flourished during warm periods. "Modern neotropical rainforests, with their palms and spectacular flowering-plant diversity, seem to have come into existence in the Paleocene epoch, shortly after the extinction of the dinosaurs 65 million years ago," said Carlos Jaramillo, staff scientist at the Smithsonian Tropical Research Institute. "Pollen evidence tells us that forests before the mass extinction were quite different from our fossil rainforest at Cerrejón. We find new plant families, large, smooth-margined leaves and a three-tiered structure of forest floor, understory shrubs and high canopy." Historically, good rock exposures and concentrated efforts by paleontologists to understand the evolution of neotropical rainforests—one of the most awe-inspiring assemblages of plant and animal life on the planet—have been lacking. "The Cerrejón mining operation is the first clear window we have to see back in time to the Paleocene, when the neotropical rainforest was first developing," said Scott Wing, a paleontologist from the Smithsonian's National Museum of Natural History. Some of the more than 2,000 fossil leaves, including the compound leaves and pods of plants in the bean family and leaves of the hibiscus family are among the oldest, reliable evidence of these groups. This was the first time that the plant families Araceae, Arecaceae, Fabaceae, Lauraceae, Malvaceae and Menispermaceae, which are still among the most common neotropical rainforest families, all occurred together. Many newcomers to modern rainforests remark that the leaves all look the same, a reasonable observation given that most have smooth margins and long "drip-tips" thought to prevent water from accumulating on the leaf surface. S. Joseph Wright, senior scientist at STRI, has noted that all of the areas in the world today with average yearly temperatures greater than 28 C are too dry to support tropical rainforests. If tropical temperatures increase by 3 C by the end of this century as predicted in the 2007 report of the Intergovernmental Panel on Climate Change, "We're going to have a novel climate where it is very hot and very wet. How tropical forest species will respond to this novel climate, we don't know," said Wright. Based on leaf shape and the size of the cold-blooded Titanoboa, Cerrejón rainforest existed at temperatures up to 30-32 C and rainfall averages exceeded 2500 mm per year. But Titanoboa's rainforest was not as diverse as modern rainforests. Comparison of the diversity of this fossil flora to modern Amazon forest diversity and to the diversity of pollen from other Paleocene rainforests revealed that there are fewer species at Cerrejón than one would expect. Insect-feeding damage on leaves indicated that they could have been eaten by herbivores with a very general diet rather than insects specific to certain host plants. "We were very surprised by the low plant diversity of this rainforest. Either we are looking at a new type of plant community that still hadn't had time to diversify, or this forest was still recovering from the events that caused the mass extinction 65 million years ago," said Wing. 
"Our next steps are to collect and analyze more sites of the same age from elsewhere in Colombia to see if the patterns at Cerrejón hold, and study additional sites that bracket the Cretaceous mass extinction, in order to really understand how the phenomenal interactions that typify modern rainforests came to be." Articles on the same topic - Plant fossils give first real picture of earliest Neotropical rainforestsThu, 15 Oct 2009, 16:26:14 EDT - Plant fossils give first real picture of earliest Neotropical rainforestsfrom Science BlogThu, 15 Oct 2009, 16:49:31 EDT - Evidence found of neotropical rainforestfrom UPITue, 13 Oct 2009, 9:42:10 EDT - The first neotropical rainforest was home of the Titanoboafrom Science CentricTue, 13 Oct 2009, 5:56:05 EDT - First Neotropical Rainforest Was Home Of The Titanoboa -- World's Biggest Snakefrom Science DailyTue, 13 Oct 2009, 0:07:05 EDT - The first neotropical rainforest was home of the Titanoboafrom PhysorgMon, 12 Oct 2009, 16:07:19 EDT Latest Science NewsletterGet the latest and most popular science news articles of the week in your Inbox! It's free! Learn more about Check out our next project, Biology.Net From other science news sites Popular science news articles - Hearing quality restored with bionic ear technology used for gene therapy - NASA satellites show drought may take toll on Congo rainforest - Superconducting qubit array points the way to quantum computers - Scientists identify source of mysterious sound in the Southern Ocean - From liability to viability: Genes on the Y chromosome prove essential for male survival - Criticism of violent video games has decreased as technology has improved, gamers age - Hummingbirds' 22-million-year-old history of remarkable change is far from complete - Research clarifies health costs of air pollution from agriculture - Ancient 'spider' images reveal eye-opening secrets - New research finds 'geologic clock' that helps determine moon's age
<urn:uuid:44e58101-50c0-4d3f-97ee-92d1c0c6cf7f>
CC-MAIN-2014-15
http://esciencenews.com/articles/2009/10/12/the.first.neotropical.rainforest.was.home.titanoboa
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223206647.11/warc/CC-MAIN-20140423032006-00362-ip-10-147-4-33.ec2.internal.warc.gz
en
0.933444
1,197
3.953125
4
Reversible computing is a model of computing where the computational process to some extent is reversible, i.e., time-invertible. In a computational model that uses transitions from one state of the abstract machine to another, a necessary condition for reversibility is that the relation of the mapping from states to their successors must be one-to-one. Reversible computing is generally considered an unconventional form of computing. There are two major, closely related, types of reversibility that are of particular interest for this purpose: physical reversibility and logical reversibility. A process is said to be physically reversible if it results in no increase in physical entropy; it is isentropic. These circuits are also referred to as charge recovery logic, adiabatic circuits, or adiabatic computing. Although in practice no nonstationary physical process can be exactly physically reversible or isentropic, there is no known limit to the closeness with which we can approach perfect reversibility, in systems that are sufficiently well-isolated from interactions with unknown external environments, when the laws of physics describing the system's evolution are precisely known. Probably the largest motivation for the study of technologies aimed at actually implementing reversible computing is that they offer what is predicted to be the only potential way to improve the energy efficiency of computers beyond the fundamental von Neumann-Landauer limit of kT ln(2) energy dissipated per irreversible bit operation. As was first argued by Rolf Landauer of IBM, in order for a computational process to be physically reversible, it must also be logically reversible. Landauer's principle is the loosely formulated notion that the erasure of n bits of information must always incur a cost of nk ln(2) in thermodynamic entropy. A discrete, deterministic computational process is said to be logically reversible if the transition function that maps old computational states to new ones is a one-to-one function; i.e. the output logical states uniquely define the input logical states of the computational operation. For computational processes that are nondeterministic (in the sense of being probabilistic or random), the relation between old and new states is not a single-valued function, and the requirement needed to obtain physical reversibility becomes a slightly weaker condition, namely that the size of a given ensemble of possible initial computational states does not decrease, on average, as the computation proceeds forwards. The reversibility of physics and reversible computing Landauer's principle (and indeed, the second law of thermodynamics itself) can also be understood to be a direct logical consequence of the underlying reversibility of physics, as is reflected in the general Hamiltonian formulation of mechanics, and in the unitary time-evolution operator of quantum mechanics more specifically. In the context of reversible physics, the phenomenon of entropy increase (and the observed arrow of time) can be understood to be consequences of the fact that our evolved predictive capabilities are rather limited, and cannot keep perfect track of the exact reversible evolution of complex physical systems, especially since these systems are never perfectly isolated from an unknown external environment, and even the laws of physics themselves are still not known with complete precision. 
Thus, we (and physical observers generally) always accumulate some uncertainty about the state of physical systems, even if the system's true underlying dynamics is a perfectly reversible one that is subject to no entropy increase if viewed from a hypothetical omniscient perspective in which the dynamical laws are precisely known. The implementation of reversible computing thus amounts to learning how to characterize and control the physical dynamics of mechanisms to carry out desired computational operations so precisely that we can accumulate a negligible total amount of uncertainty regarding the complete physical state of the mechanism, per each logic operation that is performed. In other words, we would need to precisely track the state of the active energy that is involved in carrying out computational operations within the machine, and design the machine in such a way that the majority of this energy is recovered in an organized form that can be reused for subsequent operations, rather than being permitted to dissipate into the form of heat. Although achieving this goal presents a significant challenge for the design, manufacturing, and characterization of ultra-precise new physical mechanisms for computing, there is at present no fundamental reason to think that this goal cannot eventually be accomplished, allowing us to someday build computers that generate much less than 1 bit's worth of physical entropy (and dissipate much less than kT ln 2 energy to heat) for each useful logical operation that they carry out internally. The motivation behind much of the research that has been done in reversible computing was the first seminal paper on the topic, which was published by Charles H. Bennett of IBM Research in 1973. Today, the field has a substantial body of academic literature behind it. A wide variety of reversible device concepts, logic gates, electronic circuits, processor architectures, programming languages, and application algorithms have been designed and analyzed by physicists, electrical engineers, and computer scientists. This field of research awaits the detailed development of a high-quality, cost-effective, nearly reversible logic device technology, one that includes highly energy-efficient clocking and synchronization mechanisms. This sort of solid engineering progress will be needed before the large body of theoretical research on reversible computing can find practical application in enabling real computer technology to circumvent the various near-term barriers to its energy efficiency, including the von Neumann-Landauer bound. This may only be circumvented by the use of logically reversible computing, due to the Second Law of Thermodynamics. To implement reversible computation, estimate its cost, and judge its limits, it can be formalized in terms of gate-level circuits. For example, the inverter (NOT) gate is reversible because it can be undone. The exclusive or (XOR) gate is irreversible because its inputs cannot be unambiguously reconstructed from an output value. However, a reversible version of the XOR gate—the controlled NOT gate (CNOT)—can be defined by preserving one of the inputs. The three-input variant of the CNOT gate is called the Toffoli gate. It preserves two of its inputs, a and b, and replaces the third, c, by c ⊕ (a AND b). With c = 0, this gives the AND function, and with a = b = 1 this gives the NOT function. Thus, the Toffoli gate is universal and can implement any reversible Boolean function (given enough zero-initialized ancillary bits). 
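A minimal Python sketch of this gate-level picture checks the properties just described: the Toffoli map (a, b, c) → (a, b, c ⊕ (a AND b)) is one-to-one and its own inverse, it reduces to AND and NOT for the stated input choices, and a bare XOR is not invertible.

```python
# Minimal sketch of the reversible-gate properties described above.
from itertools import product

def toffoli(a, b, c):
    """Map (a, b, c) to (a, b, c XOR (a AND b))."""
    return a, b, c ^ (a & b)

# One-to-one and self-inverse: applying the gate twice restores every input.
outputs = set()
for bits in product((0, 1), repeat=3):
    out = toffoli(*bits)
    outputs.add(out)
    assert toffoli(*out) == bits
assert len(outputs) == 8   # all 8 outputs are distinct, so no information is lost

print(toffoli(1, 1, 0))    # (1, 1, 1): with c = 0 the third output is a AND b
print(toffoli(1, 1, 1))    # (1, 1, 0): with a = b = 1 the third output is NOT c

# Plain XOR, by contrast, is irreversible: distinct inputs collide on one output.
print(0 ^ 1, 1 ^ 0)        # both give 1, so the inputs cannot be reconstructed
```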
More generally, reversible gates have the same number of inputs and outputs. A reversible circuit connects reversible gates without fanouts and loops. Therefore, such circuits contain equal numbers of input and output wires, each running through the entire circuit. Reversible logic circuits were first motivated in the 1960s by theoretical considerations of zero-energy computation as well as practical improvement of bit-manipulation transforms in cryptography and computer graphics. Since the 1980s, reversible circuits have attracted interest as components of quantum algorithms, and more recently in photonic and nano-computing technologies where some switching devices offer no signal gain.

- Reverse computation
- Reversible dynamics
- Maximum entropy thermodynamics, on the uncertainty interpretation of the second law of thermodynamics
- Reversible process
- Toffoli gate
- Fredkin gate
- Quantum computing
- Billiard-ball computer
- Three-input universal logic gate
- Reversible cellular automaton

- J. von Neumann, Theory of Self-Reproducing Automata, Univ. of Illinois Press, 1966.
- R. Landauer, "Irreversibility and heat generation in the computing process," IBM Journal of Research and Development, vol. 5, pp. 183-191, 1961.
- C. H. Bennett, "Logical reversibility of computation," IBM Journal of Research and Development, vol. 17, no. 6, pp. 525-532, 1973.
- C. H. Bennett, "The Thermodynamics of Computation -- A Review," International Journal of Theoretical Physics, vol. 21, no. 12, pp. 905-940, 1982.
- Rolf Drechsler and Robert Wille, "From Truth Tables to Programming Languages: Progress in the Design of Reversible Circuits," International Symposium on Multiple-Valued Logic, 2011. http://www.informatik.uni-bremen.de/agra/doc/konf/11_ismvl_reversible_circuit_design_tutorial.pdf
- Mehdi Saeedi and Igor L. Markov, "Synthesis and Optimization of Reversible Circuits - A Survey," ACM Computing Surveys, 2012. http://arxiv.org/abs/1110.2574
- Rolf Drechsler and Robert Wille, "Reversible Circuits: Recent Accomplishments and Future Challenges for an Emerging Technology," International Symposium on VLSI Design and Test, 2012. http://www.informatik.uni-bremen.de/agra/doc/konf/2012_vdat_reversible_circuits_accompl_chall.pdf
- P. M. B. Vitanyi, "Time, space, and energy in reversible computing" (a review of later theoretical work), Proceedings of the 2nd ACM Conference on Computing Frontiers, 2005, pp. 435-444.

- Introductory article on reversible computing
- First International Workshop on reversible computing
- Recent publications of Michael P. Frank
- Internet Archive backup of the "Reversible computing community Wiki" that was administered by Dr. Frank
- Recent Workshops on Reversible Computation
- Open-source toolkit for reversible circuit design
<urn:uuid:4472467c-dc1c-4ccd-b6a1-5b4922cecbe9>
CC-MAIN-2014-15
http://en.wikipedia.org/wiki/Reversible_computing
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609538787.31/warc/CC-MAIN-20140416005218-00299-ip-10-147-4-33.ec2.internal.warc.gz
en
0.898451
1,996
3.625
4
Over 400 million transistors are packed on dual-core chips manufactured using Intel's 45nm process. That'll double soon, per Moore's Law. And it'll still be like computing with pebbles compared to quantum computing. Quantum computing is a pretty complicated subject—uh, hello, quantum mechanics plus computers. I'm gonna keep it kinda basic, but recent breakthroughs like this one prove that you should definitely start paying attention to it. Some day, in the future, quantum computing will be cracking codes, powering web searches, and maybe, just maybe, lighting up our Star Trek-style holodecks. Before we get to the quantum part, let's start with just "computing." It's about bits. They're the basic building block of computing information. They've got two states—0 or 1, on or off, true or false, you get the idea. But two defined states is key. When you add a bunch of bits together, usually 8 of 'em, you get a byte. As in kilobytes, megabytes, gigabytes and so on. Your digital photos, music, documents, they're all just long strings of 1s and 0s, segmented into 8-digit strands. Because of that binary setup, a classical computer operates by a certain kind of logic that makes it good at some kinds of computing—the general stuff you do everyday—but not so great at others, like finding ginormous prime factors (those things from math class), which are a big part of cracking codes. Quantum computing operates by a different kind of logic—it actually uses the rules of quantum mechanics to compute. Quantum bits, called qubits, are different from regular bits, because they don't just have two states. They can have multiple states, superpositions—they can be 0 or 1 or 0-1 or 0+1 or 0 and 1, all at the same time. It's a lot deeper than a regular old bit. A qubit's ability to exist in multiple states—the combo of all those being a superposition—opens up a big freakin' door of possibility for computational powah, because it can factor numbers at much more insanely fast speeds than standard computers. Entanglement—a quantum state that's all about tight correlations between systems—is the key to that. It's a pretty hard thing to describe, so I asked for some help from Boris Blinov, a professor at the University of Washington's Trapped Ion Quantum Computing Group. He turned to a take on Schrödinger's cat to explain it: Basically, if you have a cat in a closed box, and poisonous gas is released. The cat is either dead, 0, or alive, 1. Until I open the box to find out, it exists in both states—a superposition. That superposition is destroyed when I measure it. But suppose I have two cats in two boxes that are correlated, and you go through the same thing. If I open one box and the cat's alive, it means the other cat is too, even if I never open the box. It's a quantum phenomenon that's a stronger correlation than you can get in classical physics, and because of that you can do something like this with quantum algorithms—change one part of the system, and the rest of it will respond accordingly, without changing the rest of the operation. That's part of the reason it's faster at certain kinds of calculations. The other, explains Blinov, is that you can achieve true parallelism in computing—actually process a lot of information in parallel, "not like Windows" or even other types of classic computers that profess parallelism. So what's that good for? 
For example, a password that might take years to crack via brute force using today's computers could take mere seconds with a quantum computer, so there's plenty of crazy stuff that Uncle Sam might want to put it to use for in cryptography. And it might be useful to search engineers at Google, Microsoft and other companies, since you can search and index databases much, much faster. And let's not forget scientific applications—no surprise, classic computers really suck at modeling quantum mechanics.

The National Institute of Standards and Technology's Jonathan Home suggests that given the way cloud computing is going, if you need an insane calculation performed, you might rent time and farm it out to a quantum mainframe in Google's backyard.

The reason we're not all blasting on quantum computers now is that this quantum mojo is, at the moment, extremely fragile. And it always will be, since quantum states aren't exactly robust. We're talking about working with ions here—rather than electrons—and if you think heat is a problem with processors today, you've got no idea.

In the breakthrough by Home's team at NIST—completing a full set of quantum "transport" operations, moving information from one area of the "computer" to another—they worked with a single pair of atoms, using lasers to manipulate the states of beryllium ions, storing the data and performing an operation, before transferring that information to a different location in the processor. What allowed it to work, without busting up the party and losing all the data through heat, were magnesium ions cooling the beryllium ions as they were being manipulated. And those lasers can only do so much. If you want to manipulate more ions, you have to add more lasers.

Hell, quantum computing is so fragile and unwieldy that when we talked to Home, he said much of the effort goes into methods of correcting errors. In five years, he says, we'll likely be working with mere tens of qubits. The stage it's at right now, says Blinov, is "the equivalent of building a reliable transistor" back in the day.

But that's not to say those tens of qubits won't be useful. While they won't be cracking stuff for the NSA—you'll need about 10,000 qubits for cracking high-level cryptography—that's still enough quantum computing power to calculate properties for new materials that are hard to model with a classic computer. In other words, materials scientists could be developing the case for the iPhone 10G or the building blocks for your next run-of-the-mill Intel processor using quantum computers in the next decade. Just don't expect a quantum computer on your desk in the next 10 years.

Special thanks to the National Institute of Standards and Technology's Jonathan Home and the University of Washington's Professor Boris Blinov! Still something you wanna know? Send questions about quantum computing, quantum leaps or undead cats to email@example.com, with "Giz Explains" in the subject line.
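As a rough, back-of-the-envelope illustration of the brute-force point at the top of this piece: a classical search over an N-key space needs on the order of N guesses, while an idealized Grover-style quantum search needs on the order of the square root of N operations. The sketch below shows only that scaling; it deliberately ignores hardware speed, error correction, and the fact that code-breaking approaches like Shor's algorithm work very differently.

    # Scaling comparison only; the key sizes are examples, not tied to any real system.
    for bits in (40, 64, 128):
        n = 2 ** bits
        print(f"{bits:>3}-bit keyspace: ~{n:.2e} classical tries vs ~{n ** 0.5:.2e} quantum search steps")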
<urn:uuid:04ddff00-ff41-4ce6-8cea-13d2e3c1901e>
CC-MAIN-2014-15
http://gizmodo.com/5335901/giz-explains-why-quantum-computing-is-the-future-but-a-distant-one?tag=schrodinger.s-cat
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609523429.20/warc/CC-MAIN-20140416005203-00204-ip-10-147-4-33.ec2.internal.warc.gz
en
0.938027
1,387
3.5625
4
University of Utah physicists stored information for 112 seconds in what may become the world's tiniest computer memory: magnetic "spins" in the centers or nuclei of atoms. Then the physicists retrieved and read the data electronically -- a big step toward using the new kind of memory for both faster conventional and superfast "quantum" computers. "The length of spin memory we observed is more than adequate to create memories for computers," says Christoph Boehme (pronounced Boo-meh), an associate professor of physics and senior author of the new study, published Friday, Dec. 17 in the journal Science. "It's a completely new way of storing and reading information." However, some big technical hurdles remain: the nuclear spin storage-and-read-out apparatus works only at 3.2 degrees Kelvin, or slightly above absolute zero -- the temperature at which atoms almost freeze to a standstill, and only can jiggle a little bit. And the apparatus must be surrounded by powerful magnetic fields roughly 200,000 times stronger than Earth's. "Yes, you could immediately build a memory chip this way, but do you want a computer that has to be operated at 454 degrees below zero Fahrenheit and in a big national magnetic laboratory environment?" Boehme says. "First we want to learn how to do it at higher temperatures, which are more practical for a device, and without these strong magnetic fields to align the spins." As for obtaining an electrical readout of data held within atomic nuclei, "nobody has done this before," he adds. Two years ago, another group of scientists reported storing so-called quantum data for two seconds within atomic nuclei, but they did not read it electronically, as Boehme and colleagues did in the new study, which used classical data (0 or 1) rather than quantum data (0 and 1 simultaneously). The technique was developed in a 2006 study by Boehme, who showed it was feasible to read data stored in the net magnetic spin of 10,000 electrons in phosphorus atoms embedded in a silicon semiconductor. The new study puts together nuclear storage of data with an electrical readout of that data, and "that's what's new," Boehme says. The study was led by Boehme and first author Dane McCamey, a former research assistant professor of physics at the University of Utah and still an adjunct assistant professor. His main affiliation now is with the University of Sydney. Other co-authors were Hans van Tol of the National High Magnetic Field Laboratory in Tallahassee, Fla., and Gavin Morley of University College London. The study was funded by the National High Magnetic Field Laboratory, the National Science Foundation, the Australian Research Council, Britain's Engineering and Physical Sciences Research Council and the Royal Commission for the Exhibition of 1851, a British funding agency led by Prince Philip. Of Electronic and Spintronic Memories Modern computers are electronic, meaning that information is processed and stored by flowing electricity in the form of electrons, which are negatively charged subatomic particles that orbit the nucleus of each atom. Transistors in computers are electrical switches that store data as "bits" in which "off" (no electrical charge) and "on" (charge is present) represent one bit of information: either 0 or 1. Quantum computers -- a yet-unrealized goal -- would run on the odd principles of quantum mechanics, in which the smallest particles of light and matter can be in different places at the same time. 
In a quantum computer, one quantum bit or "qubit" could be both 0 and 1 at the same time. That means quantum computers theoretically could be billions of times faster than conventional computers. McCamey says a memory made of silicon "doped" with phosphorus atoms could be used in both conventional electronic computers and in quantum computers in which data is stored not by "on" or "off" electrical charges, but by "up" or "down" magnetic spins in the nuclei of phosphorus atoms. Externally applied electric fields would be used to read and process the data stored as "spins" -- just what McCamey, Boehme and colleagues did in their latest study. By demonstrating an ability to read data stored in nuclear spins, the physicists took a key step in linking spin to conventional electronics -- a field called spintronics. Spin is an unfamiliar concept to comprehend. A simplified way to describe spin is to imagine that each particle -- like an electron or proton in an atom -- contains a tiny bar magnet, like a compass needle, that points either up or down to represent the particle's spin. Down and up can represent 0 and 1 in a spin-based quantum computer. Boehme says the spins of atoms' nuclei are better for storing information than the spin of electrons. That's because electron spin orientations have short lifetimes because spins are easily changed by nearby electrons and the temperature within atoms. In contrast, "the nucleus sits in the middle of an atom and its spin isn't messed with by what's going on in the clouds of electrons around the nucleus," McCamey says. "Nuclei experience nearly perfect solitude. That's why nuclei are a good place to store information magnetically. Nuclear spins where we store information have extremely long storage times before the information decays." The average 112 second storage time in the new study may not seem long, but Boehme says the dynamic random access memory (DRAM) in a modern PC or laptop stores information for just milliseconds (thousandths of a second). The information must be repeatedly refreshed, which is how computer memory is maintained, he adds. How to Store and Then Read Data in the Spins of Atomic Nuclei For the experiments, McCamey, Boehme and colleagues used a thin, phosphorus-doped silicon wafer measuring 1 millimeter square, and placed electrical contacts on it. The device was inside a supercold container, and surrounded by intense magnetic fields. Wires connected the device to a current source and an oscilloscope to record data. The physicists used powerful magnetic fields of 8.59 Tesla to align the spins of phosphorus electrons. That's 200,000 times stronger than Earth's magnetic field. Then, pulses of near-terahertz electromagnetic waves were used to "write" up or down spins onto electrons orbiting phosphorus atoms. Next, FM-range radio waves were used to take the spin data stored in the electrons and write it onto the phosphorus nuclei. Later, other pulses of near-terahertz waves were used to transfer the nuclear spin information back into the orbiting electrons, and trigger the readout process. The readout is produced because the electrons' spins are converted into variations in electrical current. "We read the spin of the nuclei in the reverse of the way we write information," Boehme says. "We have a mechanism that turns electron spin into a current." Summarizing the process, Boehme says, "We basically wrote 1 in atoms' nuclei. 
We have shown we can write and read [spin data in nuclei]," and shown that the information can be repeatedly read from the nuclei for an average of 112 seconds before all the phosphorus nuclei lose their spin information. In a much shorter time, the physicists read and reread the same nuclear spin data 2,000 times, showing the act of reading the spin data doesn't destroy it, making the memory reliable, Boehme says.

Reading out the data stored as spin involved reading the collective spins of a large number of nuclei and electrons, Boehme says. That will work for classical computers, but not for quantum computers, for which readouts must be able to discern the spins of single nuclei, he adds. Boehme hopes that can be achieved within a few years.

- D. R. McCamey, J. van Tol, G. W. Morley and C. Boehme. "Electronic Spin Storage in an Electrically Readable Nuclear Spin Memory with a Lifetime >100 Seconds." Science, 17 December 2010, Vol. 330, no. 6011, pp. 1652-1656. DOI: 10.1126/science.1197931
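The pulse frequencies quoted above can be sanity-checked from textbook constants. The sketch below assumes a free-electron g-factor of about 2 and the standard 31P gyromagnetic ratio (both assumptions of this illustration, not numbers taken from the paper), and estimates the resonance frequencies at the 8.59-tesla field used in the experiment.

    # Rough resonance-frequency estimates at B = 8.59 T (illustrative values only).
    g = 2.0023            # free-electron g-factor (assumed)
    mu_B = 9.274e-24      # Bohr magneton, J/T
    h = 6.626e-34         # Planck constant, J*s
    B = 8.59              # applied magnetic field, T

    f_electron = g * mu_B * B / h
    print(f"electron spin resonance: {f_electron / 1e9:.0f} GHz")   # ~240 GHz, i.e. "near-terahertz"

    gamma_P31 = 17.24e6   # 31P gyromagnetic ratio, Hz/T (assumed standard value)
    f_nucleus = gamma_P31 * B
    print(f"31P nuclear resonance: {f_nucleus / 1e6:.0f} MHz")      # ~148 MHz, in the radio-frequency range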
<urn:uuid:ba262450-8185-468e-873e-0dbbe1badfd4>
CC-MAIN-2014-15
http://www.sciencedaily.com/releases/2010/12/101216142511.htm
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609538824.34/warc/CC-MAIN-20140416005218-00638-ip-10-147-4-33.ec2.internal.warc.gz
en
0.93624
1,702
3.53125
4
Physicists at the National Institute of Standards and Technology (NIST) have demonstrated entanglement—a phenomenon peculiar to the atomic-scale quantum world—in a mechanical system similar to those in the macroscopic everyday world. The work extends the boundaries of the arena where quantum behavior can be observed and shows how laboratory technology might be scaled up to build a functional quantum computer. The research, described in the June 4 issue of Nature, involves a bizarre intertwining between two pairs of vibrating ions (charged atoms) such that the pairs vibrate in unison, even when separated in space. Each pair of ions behaves like two balls connected by a spring (see figure), vibrating back and forth in opposite directions. Familiar objects that vibrate this way include pendulums and violin strings. The NIST achievement provides insights into where and how "classical" objects may exhibit unusual quantum behavior. The demonstration also showcased techniques that will help scale up trapped-ion technology to potentially build ultra-powerful computers relying on the rules of quantum physics. If they can be built, quantum computers may be able to solve certain problems, such as code breaking, exponentially faster than today's computers. (For further details, see: http://www.nist.gov/public_affairs/quantum/quantum_info_index.html.) "Where the boundary is between the quantum and classical worlds, no one really knows," says NIST guest researcher John Jost, a graduate student at the University of Colorado at Boulder and first author of the paper. "Maybe we can help answer the question by finding out what types of things can—and cannot be—entangled. We've entangled something that has never been entangled before, and it's the kind of physical, oscillating system you see in the classical world, just much smaller." Mechanical oscillators like two pendulum-based clocks have previously been synchronized, but their vibrations can still be independent, so that changes in one have no effect on the other. Quantum entanglement—"spooky action at a distance," in Einstein's words—is a far more counterintuitive process: If two objects are entangled, then manipulating one instantaneously affects the other, no matter how far away it is. Entangled objects do not necessarily have identical properties, just properties that are linked in predictable ways. Jost and colleagues entangled the vibrational motions of two separated mechanical oscillators, each consisting of one beryllium and one magnesium ion. Each pair behaves like two objects connected by a spring 4 micrometers (millionths of a meter) long, with the beryllium and magnesium moving back and forth in opposite directions, first toward each other, then away, then back again. The two pairs perform this motion in unison, even though they are 240 micrometers apart and are located in different zones of an ion trap. The scientists created the desired entangled state at least 57 percent of the time they tried, and have identified ways to improve the success rate. The NIST experiments suggest that mechanical oscillators can take part in both the quantum and classical worlds, possessing some features of each, depending in part on the energy and other properties of the vibrations. The experiments also achieved the first combined demonstration of arranging different ions into a desired order, separating and re-cooling them while preserving entanglement, and then performing subsequent quantum operations on the ions. 
These techniques could help scientists build large-scale quantum computers that use hundreds of ions to store data and perform many computational steps. The same NIST group has previously demonstrated the basic building blocks of a quantum computer using ion traps, as well as rudimentary logic operations.

To entangle the motion of the two oscillators, the NIST group first placed four ions together in one trap zone in a particular linear order (Be-Mg-Mg-Be), and entangled the internal energy states of the two beryllium ions. The team then separated the four ions into two pairs, with each pair containing one of the entangled ions. Finally, the scientists transferred the entanglement from the beryllium ions' internal states to the oscillating motions of the separated ion pairs.

The research was funded in part by the Intelligence Advanced Research Projects Activity. The authors include former NIST post-doctoral scholars who are currently at the Weizmann Institute of Science in Israel and Lockheed Martin of Littleton, Colo.

How NIST Entangled Two Mechanical Oscillators

NIST physicists entangled two vibrating mechanical systems each consisting of one beryllium and one magnesium ion, in an experiment that required 14 milliseconds, including verification of results, and involved about 600 laser pulses. The steps below expand on information provided in the figure.

Step 1—Initially, all four ions are placed in the same zone of an ion trap and cooled with lasers to very low temperatures. By tuning the voltages of the trap electrodes scientists arrange the ions in a particular order, with both heavier magnesium ions between the beryllium ions. Using a technique developed for quantum computing several years ago, scientists entangle the two beryllium ions' internal "spin states," which are analogous to tiny bar magnets pointing up or down. Two ultraviolet laser beams, positioned at right angles, cause the ions to oscillate. The lasers are tuned so the difference between their frequencies is very close to the frequency of one of the ions' natural vibrations, the rate at which it likes to oscillate back and forth. Based on differences in their spins, the ions "feel" a differing laser force that causes the ions to oscillate in a particular way. This coupling of the spin states to motion has the global effect of entangling the spins of the beryllium ions in a controlled way.

Step 2—Voltages are then applied to electrode X to separate the ions into two pairs, which are distributed to different trap zones located adjacent to electrodes A and B. The separation and transport boost the energy of motion in the oscillating ions.

Step 3—The magnesium ions are cooled with lasers to remove excess motional energy from the beryllium ions, a process called sympathetic cooling because one type of ion cools the other. This is the first time entangled ions have been re-cooled prior to further operations, a technique expected to be useful in computing.

Step 4—By manipulating laser beam colors and orientations in a sequence of pulses of specific intensity and duration, scientists transfer the entanglement from the beryllium spins to the motion. The two mechanical oscillators are now entangled. Under ideal conditions, the beryllium and magnesium ions are oscillating back and forth in opposite directions, toward each other and then away. The two pairs perform this motion in unison, even though they are 240 micrometers apart and are located in different zones of the trap.
Scientists are not able to measure the entangled motions directly. Instead, to verify the results, they conduct a cleanup procedure partway through the experiment to ensure the entanglement has been transferred successfully from the ions' spin to their mechanical motion. Then, at the end of the experiment, they essentially reverse the entire process to transfer the entanglement from the ion motion back to the spins, to reproduce the initial beryllium spin states, which they can measure through the light scattered by the beryllium ions (spin up scatters laser light, whereas spin down does not).

The above story is based on materials provided by National Institute of Standards and Technology (NIST).

- J. D. Jost, J. P. Home, J. M. Amini, D. Hanneke, R. Ozeri, C. Langer, J. J. Bollinger, D. Leibfried & D. J. Wineland. "Entangled mechanical oscillators." Nature, 2009; 459 (7247): 683. DOI: 10.1038/nature08006
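For a concrete picture of the kind of correlation being described, here is a tiny numerical sketch (plain Python/NumPy, purely illustrative and not a model of the NIST apparatus). It builds the entangled two-spin state (|up,down> + |down,up>)/sqrt(2) and shows that only anti-correlated joint outcomes occur, which is the sense in which finding one spin "up" fixes its partner to be "down".

    import numpy as np

    # Single-spin basis states
    up = np.array([1.0, 0.0])
    down = np.array([0.0, 1.0])

    # Entangled state of two spins: (|up,down> + |down,up>) / sqrt(2)
    psi = (np.kron(up, down) + np.kron(down, up)) / np.sqrt(2)

    # Probabilities of the four joint outcomes (up-up, up-down, down-up, down-down)
    probs = np.abs(psi) ** 2
    print(dict(zip(["up,up", "up,down", "down,up", "down,down"], probs.round(3))))
    # Only the anti-correlated outcomes appear (each with probability 0.5):
    # measuring one spin immediately tells you the other, however far apart the ions are.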
<urn:uuid:acde08b0-f743-4ea2-97fd-b0cfaf8a90cf>
CC-MAIN-2014-15
http://www.sciencedaily.com/releases/2009/06/090603131429.htm
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609535535.6/warc/CC-MAIN-20140416005215-00262-ip-10-147-4-33.ec2.internal.warc.gz
en
0.914263
1,644
3.859375
4
SSL, or Secure Sockets Layer, was first developed by Netscape in the mid-1990s to address the growing need to be able to securely transmit data. It protects data, verifies the legitimacy of a website, and is supported by all major browsers.

When you log into a banking website, your computer is sent a file called an "SSL certificate." Based on the certificate's info, your browser decides whether or not to trust the certificate. This is possible because it uses third-party data, already in your browser, to confirm the certificate wasn't sent by a hacker. Once the certificate is received, the browser checks that the certificate was issued by a trusted third party known as a certificate authority. The browser then uses the certificate's public key to encrypt a random, symmetric encryption key and sends it to the server. The web server then decrypts the symmetric encryption key using its private key and uses the symmetric key to decrypt the URL and the HTTP data. Finally, the browser decrypts a response from the server using the symmetric key and displays the information.

Due to the nature of the Internet, the path the content follows between a server and a web browser is not secure. There is always the possibility someone is using a "packet sniffer" to capture data as it passes through a network or, if you're wireless, right out of the air. This is where encryption comes in. Originally, SSL used 40-bit encryption, meaning the value of the key used to decrypt data was selected from 1,099,511,627,776 possible values. Today, that level of encryption can be broken almost instantly, so 128-bit encryption is commonly used, which means 340,282,366,920,938,463,463,374,607,431,768,211,456 possible values; increase it to 256 bits for more security and the count is of the same order as estimates of the number of atoms in the universe. Even with millions of today's top-of-the-line computers working together, brute-force decryption simply takes too long if data is encrypted properly. That said, it's always best to be paranoid because future technologies like quantum computing may render conventional encryption obsolete.

If a brute-force attack won't work, how else can SSL be compromised? No matter how air-tight a security system is, all that work is pointless if users trusted with access have weak passwords or can be tricked into providing their passwords. Although not SSL-specific, it's vital that best practices are used to prevent non-technical, "social engineering" attacks.

There is also the possibility that browser and/or server flaws could be exploited. A good way to minimize the risk of a hacker taking advantage of exploits is to subscribe to Twitter feeds or blogs related to web security. This way, vulnerabilities can be fixed shortly after they're made public. Another approach would be to establish a list of supported browsers so that you can block or redirect users whose browsers aren't secure.

Flaws in SSL itself could potentially be identified and exploited. SSL supports multiple types of encryption and, in 2008, researchers were able to spoof a certificate by exploiting weaknesses in the MD5 hash algorithm. This was done with an array of 200 PlayStation 3s, and it was made possible because some certificate authorities relied on MD5 alone. So, the reliability of an SSL certificate is directly related to the reliability of its certificate authority. If a certificate authority issues a certificate to a hacker's site, users could be fooled into thinking they are on a legitimate site due to successful SSL authentication.
Furthermore, some authorities use better encryption methods than others. You can get a certificate from GoDaddy for $70/year or you can spend at least $695 at Symantec. Guess which business takes security more seriously!

First, there's a yearly cost associated with SSL which must be weighed against the security benefit. Is there any data on the site that hackers might use, or is there any motivation for your site to be hacked more than another site? If you're doing financial transactions then you pretty much have to use SSL or users will not feel secure, not to mention it would be an obvious target for hackers. That said, if your site only contains openly shared data and is backed up regularly, the biggest risks might be that an admin's password could be captured or that users might use the same password on other sites that do contain sensitive data.

SSL also uses additional server resources for encrypting and decrypting content. Although the difference is minor due to the processing power of today's servers, it can be noticeable on high-traffic sites. If you want to mix secure and non-secure content on the same page then users may get browser warnings, so this limits the ability to host some content elsewhere; for example, on a content distribution network. Finally, extra time is needed to purchase the certificate, set up the server, configure the website, and test.

Sometimes SSL is a given, but it can be more of a qualitative question based on the balance between practicality and ideology. Yes, any unencrypted login is vulnerable to attack, but what are the chances? The best thing to do is weigh the overall cost of SSL against how sensitive your content is and what might happen, worst case, if it is compromised. If you're not sure whether or not to use SSL but you have the money and don't see any major technical obstacles then go ahead and use it. A less expensive alternative might be to integrate a service like PayPal that handles authentication outside your website.

On the other hand, if SSL's authentication and encryption aren't enough, consider using physical tokens. A physical token is a device that assists with authentication. For example, the device may periodically display a different value used to log in based on the current time. This approach removes the reliance on the certificate authority and allows more control over who has access. It can even be used to establish a VPN connection to the server before the website can be accessed.

When configuring Drupal to use SSL, a good place to start is the Secure Pages module, which lets you define which pages are secure and handles redirects from or to secure pages as needed. If you're using Secure Pages with Drupal 6 then the Secure Pages Prevent Hijack module should be installed to prevent hijacked sessions from accessing SSL pages. Also, the Auth SSL Redirect module can be used to redirect authenticated users to SSL and it will work in conjunction with Secure Pages. If you're using Ubercart and want to either secure the whole site or just Ubercart pages then another option is Ubercart SSL and it can be extended to secure additional pages. In general, these modules help manage transitions between secure and insecure pages. [Updated based on comment feedback.]

What do you think, what approaches do you recommend, and what do you recommend against?
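If you want to see roughly what the browser-side certificate check described above looks like in practice, the short sketch below uses Python's standard ssl and socket modules to connect to a host, validate its certificate against the system's trusted certificate authorities, and print the issuer and expiry date. It is a simplified illustration (the hostname is just an example), not a substitute for a browser's full validation logic.

    import socket
    import ssl

    def inspect_certificate(host, port=443):
        # Build a context that uses the system's trusted CA store,
        # the same kind of third-party trust data a browser relies on.
        context = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=5) as sock:
            # wrap_socket() performs the TLS handshake and certificate validation;
            # it raises an SSLError if the chain of trust or hostname check fails.
            with context.wrap_socket(sock, server_hostname=host) as tls:
                return tls.getpeercert()

    cert = inspect_certificate("example.com")   # example hostname
    print(cert["issuer"])     # the certificate authority that signed the certificate
    print(cert["subject"])    # who the certificate was issued to
    print(cert["notAfter"])   # expiry date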
<urn:uuid:7828be9a-ee2e-4c32-898c-a387bf79dadf>
CC-MAIN-2014-15
http://www.mediacurrent.com/blog/secure-authentication-and-drupal
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609521558.37/warc/CC-MAIN-20140416005201-00535-ip-10-147-4-33.ec2.internal.warc.gz
en
0.932027
1,392
4
4
Scientists find a way to directly measure quantum states, such as momentum, of photons. Credit: MPQ, Quantum Dynamics Division.

Quantum computers and communications promise more powerful machines and unbreakable codes. But to make them work, it's necessary to measure the quantum state of particles such as photons or atoms. Quantum states are numbers that describe particle characteristics such as momentum or energy. But measuring quantum states is difficult and time-consuming, because the very act of doing so changes them, and because the mathematics can be complex. Now, an international team says they found a more efficient way to do it, which could make it simpler to build quantum-mechanical technologies.

In a study detailed in the Jan. 20 issue of the journal Nature Communications, researchers from the University of Rochester and the University of Glasgow took a direct measurement of a photon's 27-dimensional quantum state. These dimensions are mathematical, not dimensions in space, and each one is a number that stores information. To understand a 27-dimensional quantum state, think about a line described in two dimensions. A line would have a direction in the X and Y coordinates — 3 inches left and 4 inches up, for instance. The quantum state has 27 such coordinates.

"We chose 27, kind of to make a point about 26 letters in the alphabet and throwing in one more," said Mehul Malik, now a postdoctoral researcher at the University of Vienna. That means each quantum bit, or "qubit," could store a letter instead of a simple 1 or 0.

Seeing a photon

The group, led by Malik and Robert Boyd, a professor of optics and physics at the University of Rochester, was able to see a photon's states directly. They measured the photon's orbital angular momentum, which is how much the particles of light "twist" as they travel through space.

Ordinarily, finding the quantum state of a photon requires a two-step process. First, scientists have to measure some property of the photon, such as its polarization or momentum. The measurements are performed on many copies of the quantum state of a photon. But that process sometimes introduces errors. To get rid of the errors, the scientists have to look at what results they got that are "disallowed" states — ones that don't follow the laws of physics. But the only way to find them is to search through all the results and discard the ones that are impossible. That eats up a lot of computing time and effort. This process is called quantum tomography.

A light wave is a combination of an electric and magnetic field, each of which oscillates and makes a wave. Each wave moves in time with the other, and they are perpendicular to each other. A beam of light is made up of lots of these waves.

Light can have what is called orbital angular momentum. In a beam with no orbital angular momentum, the peaks of the waves — the electric ones, for example — are lined up. A plane connecting these peaks will be flat. If the beam has orbital angular momentum, a plane connecting these peaks will make a spiral, helical pattern, because the light waves are offset from one another slightly as you go around the beam. To measure the state of the photons, scientists must "unravel" this helical shape of the waves in the beam.
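As a concrete (and purely illustrative) picture of what a 27-dimensional state is, the sketch below represents it as a length-27 vector of complex amplitudes whose squared magnitudes sum to one, and shows how one symbol of a 27-character alphabet could be carried by a single basis state. The choice of the space character as the 27th symbol is an assumption made for this example, not something stated by the researchers.

    import numpy as np

    d = 27  # dimensions per photon, as in the experiment described above

    # A pure d-dimensional state: 27 complex amplitudes, normalized to 1.
    rng = np.random.default_rng(0)
    amplitudes = rng.normal(size=d) + 1j * rng.normal(size=d)
    state = amplitudes / np.linalg.norm(amplitudes)
    print(np.sum(np.abs(state) ** 2))        # 1.0 (total probability)

    # One symbol from a 27-letter alphabet can be mapped onto a single basis state.
    alphabet = "abcdefghijklmnopqrstuvwxyz "  # 26 letters plus a space (assumed here)

    def encode(symbol):
        basis_state = np.zeros(d, dtype=complex)
        basis_state[alphabet.index(symbol)] = 1.0
        return basis_state

    print(int(np.argmax(np.abs(encode("q")) ** 2)))   # 16, the index of "q"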
Measuring a photon's quantum state

The team first fired a laser through a piece of transparent polymer that refracted the light, "unraveling" the helix formed by the waves. The light then passed through special lenses and into a grating that makes many copies of the beam. After passing through the grating, the light is spread out to form a wider beam.

After the beam is widened, it hits a device called a spatial light modulator. The modulator carries out the first measurement. The beam then reflects back in the same direction it came from and passes through a beam splitter. At that point, part of the beam moves toward a slit, which makes a second measurement. One of the two measurements is called "weak" and the other "strong." By measuring two properties, the quantum state of the photons can be reconstructed without the lengthy error-correction calculations tomography requires.

In quantum computers, the quantum state of the particle is what stores the qubit. For instance, a qubit can be stored in the photon's polarization or its orbital-angular momentum, or both. Atoms can also store qubits, in their momenta or spins. Current quantum computers have only a few bits in them. Malik noted that the record is 14 qubits, using ions. Most of the time, ions or photons will only have a couple of bits they can store, as the states will be two-dimensional. Physicists use two-dimensional systems because that is what they can manipulate — it would be very difficult to manipulate more than two dimensions, he said.

Direct measurement, as opposed to tomography, should make it easier to measure the states of particles (photons, in this case). That would mean it is simpler to add more dimensions — three, four, or even, as in this experiment, 27 — and store more information.

Mark Hillery, a professor of physics at Hunter College in New York, was skeptical that direct measurement would prove necessarily better than current techniques. "There is a controversy about weak measurements — in particular, whether they really are useful or not," Hillery wrote in an email to LiveScience. "To me, the main issue here is whether the technique they are using is better (more efficient) than quantum-state tomography for reconstructing the quantum state, and in the conclusion, they say they don't really know."

Jeff Savail, a master's candidate researcher at Canada's Simon Fraser University, worked on a similar direct measurement problem in Boyd's lab, and his work was cited in Malik's study. In an email he said one of the more exciting implications is the "measurement problem." That is, in quantum mechanical systems the question of why some measurements spoil quantum states while others don't is a deeper philosophical question than it is about the quantum technologies themselves. "The direct measurement technique gives us a way to see right into the heart of the quantum state we're dealing with," he said. That doesn't mean it's not useful – far from it. "There may also be applications in imaging, as knowing the wave function of the image, rather than the square, can be quite useful."

Malik agreed that more experiments are needed, but he still thinks the advantages might be in the relative speed direct measurement offers. "Tomography reduces errors, but the post-processing [calculations] can take hours," he said.
<urn:uuid:6986ea97-aa5c-4e7e-ba93-50f9f036a88c>
CC-MAIN-2014-15
http://www.livescience.com/42899-physicists-measure-photons-quantum-states.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609525991.2/warc/CC-MAIN-20140416005205-00215-ip-10-147-4-33.ec2.internal.warc.gz
en
0.955368
1,416
4.3125
4
Entanglement is one of the defining properties that distinguish quantum systems from their classical counterparts. It refers to correlations between measurement outcomes on distinct (and potentially distant) degrees of freedom of a system that are stronger than those found in any classical experiment. Quantum entanglement is the key resource that enables the dramatic speedup of calculations in a quantum computer, as well as various other quantum information processing tasks. In a paper appearing in Physical Review Letters, Haohua Wang and co-workers at the University of California, Santa Barbara, US, Zhejiang University, China, and NEC Corporation, Japan, have experimentally demonstrated entanglement between two spatially separated oscillating electrical circuits. This experiment represents the latest step by these researchers towards the engineering of large scale networks of controlled, entangled systems, which might be useful as quantum computers, or for engineering new states of quantum matter [3, 4].

Harmonic oscillators would seem to be ideal building blocks for constructing highly entangled states. The harmonic oscillator is a particularly well studied exemplar: the classical physics of harmonic oscillators—such as a mass accelerated by the linear restoring force provided by a spring—is understood by high school physics students, whereas the quantum harmonic oscillator is one of the first systems to be dealt with in undergraduate quantum mechanics courses. The quantum dynamics of a harmonic oscillator can be solved exactly, and such solutions are often the starting point in the understanding of quantum field theory. Harmonic oscillators are ubiquitous in physics, and many realizations of such oscillators can be found, ranging from mechanical systems, electrical circuits, and lattice vibrations to elementary excitations of the electromagnetic field (photons).

In the context of the experiment carried out by Wang et al., each harmonic oscillator consists of a coplanar waveguide resonator—equivalent to a circuit comprising a capacitor and an inductor (see Fig. 1). This resonator is superconducting at low temperature and can store excitations for a long time (in other words, it takes a relatively long time for excitations to decay). Excitations of this circuit can be thought of as photons—excitations of the electromagnetic field associated with the circuit elements.

Unfortunately, the simplicity of such harmonic oscillators means that a quantum system consisting solely of linearly coupled oscillators (that is, where the Hamiltonian contains coupling terms that are, at most, bilinear in the coordinate or conjugate momentum of each oscillator) is insufficient for many quantum information processing tasks. Such linear systems are not capable of implementing arbitrary quantum algorithms (unless augmented with additional resources, such as single photon sources or photon-counting detectors). Driving a harmonic oscillator with a classical oscillating field only allows the preparation of a restricted class of states, known as coherent states, and with only linear couplings between oscillators, it is not possible to transform such coherent initial states into entangled states of multiple oscillators.
One can gain some insight into this restriction by considering the spectrum of a quantum harmonic oscillator: all the energy levels of the oscillator are equally spaced, and so it is not possible to address resonant transitions between a pair of states without also driving transitions between all other states.

Wang et al.'s experiment overcomes these limitations with the inclusion of nonlinear circuit elements: each oscillator is coupled to a superconducting phase qubit, so called because the quantum information is represented by the phase difference between the superconducting condensates on either side of an insulating barrier known as a Josephson junction (see Fig. 1). The Josephson junction is connected in parallel with a capacitor and an inductance loop. The Josephson junction adds a sinusoidal potential to the Hamiltonian of the system. The circuit is therefore no longer a purely harmonic oscillator, which means that the energy levels of the phase qubit are not equally spaced. This allows one to resonantly address a single pair of states of the phase qubit with an oscillating external field, allowing the preparation of single excitations of the qubit.

By means of a sequence of such driving pulses, together with a bias flux that shifts the qubit energy levels in and out of resonance with the central coupling resonator (denoted by C in Fig. 1), an entangled state can be established between the two phase qubits. Subsequently, this entangled state can be swapped onto the resonators A and B to create a state of the form (|1⟩_A|0⟩_B + |0⟩_A|1⟩_B)/√2, in which a single photon is shared between the two resonators. More complicated sequences of driving pulses, using higher lying states of the qubit, are used to engineer more general entangled states of the oscillators, such as the N-photon NOON state, (|N⟩_A|0⟩_B + |0⟩_A|N⟩_B)/√2, which is a superposition of two N-photon states, where one state has all N photons in resonator A, and the other state has all N photons in resonator B.

In the final stage of each experiment, the joint state of the two-oscillator system is established with a method called quantum tomography, again with the aid of the phase qubits. Tomography is a process by which the complete state of a quantum system can be determined via repeated preparation and measurement steps. The technique used here is a generalization of a technique pioneered by the University of California, Santa Barbara, group in a recent elegant experiment. With tomographic data, Wang et al. are able to verify the entanglement of the final state, demonstrating entangled NOON states with up to three photons, although the fidelity of these entangled states is reduced somewhat by the short coherence time of the phase qubit. The entanglement in this experiment is truly macroscopic—each resonator is almost a centimeter long, and the two resonators are well separated on the chip—and therefore the entangled systems are large enough to be easily resolved by the naked eye.

Numerous exciting possibilities are opened up by the techniques developed in this experiment. The ability to deterministically generate NOON states may have applications in Heisenberg limited metrology, that is, quantum assisted measurement, with accuracy beyond that implied by the standard shot noise limit. The superconducting oscillators in Wang et al.'s experiment have a comparatively long coherence time, which is an order of magnitude larger than the coherence time for the phase qubits. Thus these oscillators may form a useful substrate for a microwave-frequency "linear optical quantum computer".
Such a device may be easier to construct than a conventional, gate-model quantum computer, and yet would have equivalent computational power. Finally, large-scale networks of superconducting oscillators may be used to engineer new, exotic states of quantum matter. Recently, networks have been proposed that allow one to study the physics of a system of photons with broken time reversal symmetry, potentially allowing analogs of the quantum Hall effect for photons. Networks of entangled superconducting elements might also be useful as "protected qubits" [4, 11]; that is, qubits that are inherently protected from environmental noise. These fascinating theoretical proposals might once have seemed far-fetched, but continued experimental progress along the lines reported by Wang et al. gives reason to be optimistic that they may be realized in the not-too-distant future.

- H. Wang, M. Mariantoni, R. C. Bialczak, M. Lenander, E. Lucero, M. Neeley, A. D. O'Connell, D. Sank, M. Weides, J. Wenner, T. Yamamoto, Y. Yin, J. Zhao, J. M. Martinis, and A. N. Cleland, Phys. Rev. Lett. 106, 060401 (2011).
- M. A. Nielsen and I. L. Chuang, Quantum Computation and Quantum Information (Cambridge University Press, Cambridge, 2000).
- J. Koch, A. A. Houck, K. Le Hur, and S. M. Girvin, Phys. Rev. A 82, 043811 (2010); see also the Viewpoint commentary by A. D. Greentree and A. M. Martin, Physics 3, 85 (2010).
- L. B. Ioffe and M. V. Feigel'man, Phys. Rev. B 66, 224503 (2002).
- S. L. Braunstein and P. van Loock, Rev. Mod. Phys. 77, 513 (2005).
- E. Knill, R. Laflamme, and G. J. Milburn, Nature 409, 46 (2001).
- M. H. Devoret, A. Wallraff, and J. M. Martinis, arXiv:cond-mat/0411174v1.
- H. Lee, P. Kok, and J. P. Dowling, J. Mod. Opt. 49, 2325 (2002).
- M. Hofheinz et al., Nature 459, 546 (2009).
- L. Chirolli, G. Burkard, S. Kumar, and D. P. DiVincenzo, Phys. Rev. Lett. 104, 230502 (2010).
- A. Kitaev, arXiv:cond-mat/0609441v2.
<urn:uuid:b3e2a77b-3016-40ca-9fec-fef2a20afe61>
CC-MAIN-2014-15
http://physics.aps.org/articles/print/v4/11
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609538110.1/warc/CC-MAIN-20140416005218-00634-ip-10-147-4-33.ec2.internal.warc.gz
en
0.904297
1,920
3.9375
4
In 1965, Gordon Moore predicted that processing power should double every eighteen months.[1] Traditionally, this rapid growth has been achieved by shrinking distances between transistors and shortening the distance that information needs to pass through.[1] However, the miniaturization of processors and transistors will soon reach a physical barrier.[2] With this knowledge, researchers have begun searching for new computing systems that take different approaches to achieving greater efficiency. Many possible computing models have been explored, including optical computing, quantum computing, and perhaps most interestingly, biological computing.

Biological computing is an altogether very new and very different approach. Rather than attempting to increase the speed of each individual operation, biological computing uses components of living organisms to perform computing tasks faster through massive parallelism, a technique that uses a large number of elements each performing smaller tasks.[1] Many recent advances have demonstrated the potential of biological computing, even though research has only begun. For example, Adamatzky and Selim Akl at Queen's University demonstrated the ability of slime molds to determine the most efficient paths across networks, and Swiss researchers have successfully programmed human cells to perform binary operations.[3]

Currently, the preeminent developments in biological computing have occurred in DNA computing. DNA fragments of varying lengths are placed in a solution along with ATP to power the reaction, and the results are analyzed by determining the length and sequence of the output DNA molecule.[4] DNA computing allows for the storage of data in a four-letter code – "A," "T," "C," and "G" – which is capable of storing far more data more compactly than the binary digit storage of electronic computers.[4] In a brilliant example showcasing the potential of DNA computing to revolutionize man-machine interactions, Ehud Shapiro at the Weizmann Institute harnessed DNA computing to diagnose cancerous activity from within the cell and then release an anti-cancer drug based upon the resulting output.[4]

Advances in biological computing foreshadow a massive revolution in computing technologies by removing physical limitations, improving parallel processing, increasing energy efficiency, and reducing toxicity.[1] First, while traditional computational development has relied upon reducing the sizes of and distances between transistors, techniques that will soon face physical limitations, biological computing rapidly increases speed by using more effective parallel processing, which is able to perform 100 times more operations per second than conventional measures.[1] Second, biological computing is more energy efficient, relying on energy stored chemically in ATP instead of conventional energy supplies.[1] Third, the use of biological components greatly reduces the price and toxicity of computing components, as most biological components are readily available and non-toxic.[1] And lastly, biological computing allows for a completely new approach to problem solving: rather than approaching problems sequentially like traditional computers, biological computing uses a unique data structure focused upon parallel operations.[4]

Revolutionizing the computing industry would have groundbreaking impacts in all fields of science, research, technology, and society, since computers are crucial for scientific advancement in all scientific and engineering fields.
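To make the four-letter-code point above concrete, here is a tiny illustrative sketch (an assumption-laden toy, not a description of any actual DNA-computing encoding scheme): it packs ordinary bytes into the bases A, C, G, and T at two bits per base, which is the sense in which a four-symbol alphabet stores data more densely per symbol than a binary one.

    # Toy encoding only: two bits per DNA base; the bit-to-base mapping is arbitrary.
    BASE_FOR_BITS = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}

    def bytes_to_dna(data):
        bases = []
        for byte in data:
            for shift in (6, 4, 2, 0):                # four 2-bit chunks per byte
                bases.append(BASE_FOR_BITS[(byte >> shift) & 0b11])
        return "".join(bases)

    print(bytes_to_dna(b"hi"))   # "CGGACGGC": 8 bases encode 2 bytes (16 bits)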
The decreased toxicity, increased availability, and greater energy efficiency of biological computers may lead to massive benefits for the environment. Traditional computers are major contributors to our carbon footprint; by 2020, the carbon emissions from data centers and Internet services are expected to increase four-fold, surpassing even the carbon footprint of the aviation industry.[5] In addition, the production of traditional computers requires enormous amounts of natural resources. A single silicon chip requires 1.6 kilograms of fossil fuels, 72 grams of chemicals, and 32 kilograms of water to manufacture, which is altogether over 700 times the weight of the final product.[6] The disposal of traditional computers is further complicated by the heavy metals they contain, especially lead, mercury, and cadmium, which can easily leak into and contaminate the environment.[6] By replacing the need for silicon and other inorganic materials with readily available organic materials, biological computing can help reduce resource strain. Furthermore, the decreased toxicity allows for safer production, storage, and disposal than silicon-based computers. Finally, the improved energy efficiency of biological computing can allow for a decrease in global energy consumption, reducing the strain on fossil fuels and decreasing the amount of pollutants released into the environment due to energy production. This could help reduce damage to ecosystems, decrease biodiversity loss due to toxicity, and combat climate change by decreasing energy consumption.

In addition to advancing computing, biological computing also allows for unprecedented advances in medicine and biology by allowing closer integration with living material. Biological data is already used to control the chemicals synthesized by various organisms; the development of organic data processing and memory storage greatly compounds this synergy.[1] As demonstrated by the earlier research done by Shapiro on cancer diagnoses and treatment, biological computers could provide a means to treat and diagnose genetically based illnesses from within living organisms.[1] For instance, Adamatzky Aki, a leading researcher in DNA computing, has suggested the use of a biological implant to detect and treat breast cancer.[3] In addition, biological computing could be used to link silicon-based computing and living organisms. Studies on eels have demonstrated that living things can be linked to robots and controlled, providing the ability for humans to study organisms in unprecedented ways and allowing for advances in interactive prosthetics.[1] Biological computing could also allow the introduction of computing in harsher natural environments by mimicking the adaptive strategies of resilient life-forms.[3] Overall, these advantages could radically change our ability to garner data for a variety of fields, including biology, animal behavior, and studies in extreme environments. In addition, intimate integration with biological tissue could revolutionize the treatment of cancer and other diseases, transform health care, and pave the way for artificially constructed or controlled organisms that create new opportunities in fields ranging from farming to prosthetics.

1. Fulk, Kevin. "Biological computing." ISRC Future Technology Topic Brief. 2002.
2. Junnarkar, Sandeep. "Tomorrow's Tech: The Domino Effect." CNET News. October 24, 2002.
3. Baer, Adam. "Why living cells are the future of data processing." PopSci. November 5, 2012.
4. Tagore, Somnath; Bhattacharya, Saurav; Islam, Ataul; Islam, Lutful. "DNA Computation: Applications and Perspectives." Journal of Proteomics & Bioinformatics. June 29, 2010: 234-243.
5. Kanter, James. "The Computer Age and its Carbon Footprint." New York Times. June 13, 2008.
6. Locklear, Fred. "The Environmental Impact of Computing." Ars Technica. Nov. 12, 2002.
<urn:uuid:5bb6b33b-1d2a-4ee4-8886-fa98569b2e34>
CC-MAIN-2014-15
http://triplehelixblog.com/2013/04/beyond-silicon-the-evolution-of-biological-computing/
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223201753.19/warc/CC-MAIN-20140423032001-00004-ip-10-147-4-33.ec2.internal.warc.gz
en
0.915029
1,366
3.609375
4
March 17, 2013 | 3

Though the concept of the robot seems modern and relatively new, robots have been around for years. The first possible description of a robot in literature is found in the Iliad, in a reference to "a three-legged cauldron that had ears for handles". Later, in 1900, we were introduced to Tik-Tok in L. Frank Baum's Wizard of Oz. The word robot was first used in 1920 by the Czech writer Karel Čapek in his play R.U.R. (Rossum's Universal Robots), the first dramatization of a robot under this name. Robots came to life and were put to practical use in 1962, when General Motors became the first company to use a robot for industrial purposes. Since then, robots have been used in many ways. They have come in all shapes and sizes. They have been used in the medical field, the armed forces, and the space program.

Now, in the 21st century, technology continues to evolve, and a new kind of robot is being studied and researched: the quantum robot. The quantum robot is the idea of combining quantum theory with robot technology; in other words, it is a practical combination of quantum computing and robotics. Quantum computing involves using quantum systems and quantum states to do computations. A robot is an automated machine that is capable of doing a set of complex tasks. In some applications, the programming used to run the robots may be based on artificial intelligence. Artificial intelligence is the ability of a computer system to operate in a manner similar to human intelligence; think of it as training a machine to act like a human.

Essentially, quantum robots are complex quantum systems. They are mobile systems with on-board quantum computers that interact with their environments. Several programs would be involved in the operation of the robot, notably quantum searching algorithms and quantum reinforcement learning algorithms. Quantum reinforcement learning is based on the superposition of quantum states and on quantum parallelism. A quantum state is specified by a set of quantum numbers. The four basic quantum numbers represent the energy level, the angular momentum, the spin, and the magnetic quantum number. In the superposition of quantum states, the idea is to get one state to look like another.

Let's say I have two dogs. One dog knows how to fetch a bone (energy level), sit up (angular momentum), give a high five (spin), and shake hands (magnetic quantum number). Now, let's apply the superposition of quantum states. Since one dog has been trained and given the commands, the other dog must learn to mimic or copy what the first dog did. Each time a command is achieved, reinforcement is given. The reinforcement for the dog would be a bone (or no bone if the command is not achieved). In quantum reinforcement learning, it is slightly different. The idea would be similar to an "If-Then" statement. An example would be: if the quantum state has a certain energy level, then the angular momentum has a certain value. This idea of "If-Then" statements in the quantum world leads to an idea which can be a topic of its own: quantum logic.

Quantum parallelism simply means that computations can happen at the same time. This allows all of the quantum numbers of the quantum system to be measured at the same time. If there are multiple quantum systems, then by using the concept of parallelism, all systems can be measured at the same time.
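To make the superposition idea above concrete, here is a minimal numpy sketch of my own (not code from the article): it puts three qubits into an equal superposition and prints the probability amplitudes of all eight basis states at once.

```python
import numpy as np

# Illustrative only: equal superposition of n qubits, with explicit amplitudes.
ket0 = np.array([1.0, 0.0])                    # the |0> basis state
H = np.array([[1.0, 1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)     # Hadamard gate

def uniform_superposition(n_qubits):
    """Put n qubits, each starting in |0>, into an equal superposition."""
    state = np.array([1.0])
    for _ in range(n_qubits):
        state = np.kron(state, H @ ket0)
    return state

state = uniform_superposition(3)
print(state)                        # eight amplitudes, each 1/sqrt(8)
print(np.sum(np.abs(state) ** 2))   # the probabilities still sum to 1
```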
Programs used for "quantum searching" are based on quantum random walks. Quantum random walks use probability amplitudes, which assign a weight to each of the several possible quantum states. In the classical world, if you type the word "Quantum" into a search engine, you get many results; you may have a tough time finding a needle in that haystack. If you refine your search, say to "Quantum Random Walks", the search narrows. The same principle applies in quantum computing to get more refined results. However, you are not necessarily searching for words; you are finding information that may correlate to a quantum state.

What would be the advantages of the quantum robot over the ordinary robot? Quantum robots can examine their environments and carry out tasks in more intricate ways because they exploit quantum effects. Because of the capabilities of quantum computing, quantum robots are expected to be faster, more accurate, and better able to multitask than standard robots. Quantum robots may one day give us better medical diagnoses and better data interpretation in other research fields such as defense research. In medicine, they may be able to detect pathological changes in the body after being injected into the bloodstream. In the space program, they may be able to examine the delicate environments on other planets. In the military, they may be able to detect changes in magnetic and electric fields. They may also help us detect early warnings of disasters more efficiently.
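The walk-based searching mentioned above can be sketched with a generic textbook construction. The snippet below simulates a discrete-time "coined" quantum walk on a ring; it is only an illustration of how probability amplitudes spread, not the specific algorithms a quantum robot would run.

```python
import numpy as np

# Discrete-time ("coined") quantum walk on a ring of N sites.
N, steps = 64, 20
coin = np.array([[1.0, 1.0],
                 [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard coin

amp = np.zeros((2, N), dtype=complex)           # amp[c, x]: coin state c at site x
amp[0, N // 2] = 1.0                            # walker starts in the middle

for _ in range(steps):
    amp = coin @ amp                            # toss the quantum coin
    amp[0] = np.roll(amp[0], 1)                 # coin state 0 steps right
    amp[1] = np.roll(amp[1], -1)                # coin state 1 steps left

prob = np.abs(amp[0]) ** 2 + np.abs(amp[1]) ** 2
print(round(prob.sum(), 6))                     # total probability stays 1
print(prob.argmax())                            # the peak sits well away from the start site
```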
<urn:uuid:62b4c3cd-00db-4d16-af21-1abf9cf89a8b>
CC-MAIN-2014-15
http://blogs.scientificamerican.com/guest-blog/2013/03/17/i-quantum-robot/
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223203235.2/warc/CC-MAIN-20140423032003-00348-ip-10-147-4-33.ec2.internal.warc.gz
en
0.944117
1,075
3.578125
4
Quantum computation in diamond

Computers do not necessarily have to perform error-free calculations in order to provide perfect results—they only need to correct their errors in a reliable way. And this will become even more important in the future, when it is hoped that quantum computers will solve some tasks several times faster than conventional PCs, with computing processes that are very efficient but also prone to disturbances. An international team headed by physicists from the University of Stuttgart and the Stuttgart Max Planck Institute for Solid State Research has now found a way to control the quantum system of a diamond with a small number of nitrogen impurities particularly well. The researchers can thus specifically address quantum bits, i.e. the smallest computing units of a quantum computer, in the diamond and combine several bits into a computing register. They use the new degree of control for a logic operation, which is essential for a quantum computer, and for error correction.

Physicists already have a fairly accurate idea of where the strengths of a quantum computer would lie: it could carry out searches in large databases, encoding and decoding, or research tasks in quantum physics much faster than any conceivable conventional computer today. However, there is still no really clear idea of what the blueprint of a quantum computer should look like; neither is there currently a real favourite among the materials from which quantum processors could be made. Possible options are ions trapped by electric fields, atoms in optical lattices, devices made of superconductors, or diamonds doped with tiny quantities of nitrogen, for example. Physicists working with Jörg Wrachtrup, professor at the University of Stuttgart and Fellow of the Max Planck Institute for Solid State Research, have been investigating for some time diamonds that are sporadically interspersed with nitrogen. On the road to the quantum computer, they have now helped these diamonds over several hurdles simultaneously. The Stuttgart-based researchers not only produced a quantum register, the counterpart of a conventional processor, in a diamond; they were also able to reliably control the register, use it to carry out a logic operation, and correct errors in it. "Since we meanwhile understand the quantum mechanics of our system well, we can produce quantum registers using a quite simple approach that doesn't require complex cryogenic technology or laser systems," says Jörg Wrachtrup.

A quantum register is in a superposition state of several qubits

A quantum register always contains individual qubits (short for quantum bits), which can be in one of two states, just like conventional bits, in order to represent a zero or a one. Unlike conventional bits, however, several qubits can be brought into superposition states in which every individual bit virtually floats between "zero" and "one". Each superposition state occurs with a different weight, and these weighted possibilities are all contained in the quantum register. The possibilities can be used like the bits of a conventional computer for some parallel computations. The more quantum bits are combined in a register, the more powerful, but also the more sensitive, the processor becomes. This is because external disturbances push a qubit only too easily from the floating state between "one" and "zero" towards one of the two options.
In the worst case, unwelcome external influences destroy the sensitive superposition and render it useless for parallel computations. The researchers in Stuttgart have now found a remedy for this.

Three nuclear spins are combined into a quantum register via a defect

The nitrogen defect—physicists call it an NV centre (NV: nitrogen vacancy)—can become a trap for a single electron. An electron also has a spin, and its orientation affects the orientation of the nearby nuclear spins. The electron spin can be switched faster than the nuclear spins, but it is more prone to disturbances. The researchers use it to deliver control commands to the nuclear spins that cannot be transmitted with radio frequency pulses alone. The electron in the defect thus provides the communication between the nuclear spins in the quantum register. Finally, the physicists use it as a tool to help them read out the nuclear spins.

A quantum register with a fast switch and a robust storage device

"In the past, the electron of the NV centre has been used as a storage device in order to expand the quantum register," says Gerald Waldherr, who played a crucial role in the experiments. "We use the electron solely to control the nuclear spins on which the quantum information is stored." This allows the researchers to combine the advantages of both systems: the quantum register can be switched rapidly using the electron spin, while the nuclear spins store the information in a relatively reliable way, as they withstand disturbances well. Assisted by the electron spin, the physicists use an ingenious combination of light and radio frequency pulses to first manipulate the three nuclear spins into a superposition state: they entangle the nuclear spins. Quantum mechanical entanglement creates a kind of virtual bond between quantum particles so that they know of each other's existence. Only entangled systems are suitable as quantum registers, because only they allow the parallel operation of the quantum computer.

A CNOT gate allows other computing operations

In the next step, the researchers showed that logic operations are possible in this quantum register using a CNOT gate—a logic operation that is particularly important for quantum computers. "All other operations can be realised with the CNOT gate and local operations on individual qubits," explains Gerald Waldherr. The CNOT gate switches one bit depending on a second bit. If the latter represents a "one", for example, the first one is set from "zero" to "one" or vice versa; it remains unchanged, however, if the latter is at "zero". The researchers in Stuttgart carried out exactly this operation on the nuclear spins in their register by sending a sequence of different radio frequency pulses to the NV centre or the nuclear spins.

The CNOT gate is not only indispensable for the computing power of a quantum computer, it also makes error correction possible. Although nuclear spins are not as sensitive to interference as electron spins, they are by no means immune. Gerald Waldherr and his colleagues demonstrated how possible errors can be corrected for one of the possible superposition states of their quantum register. To correct the errors, the scientists benefit from the fact that the superposition states are not arbitrary combinations of all possible spin orientations. Rather, in one of these superposition states all qubits are either "one" or "zero". In another state, two are always "one". Errors are thus evident immediately.
And with the aid of the two intact qubits, the original state of the third can be reconstructed. The CNOT operation is the tool of choice for this, because it switches one bit depending on another. An ingenious sequence of CNOT operations on the three qubits of the quantum register thus not only shows whether one bit deviates from the characteristic pattern of the particular superposition state, it even corrects the error immediately.

The plan is to increase the number of qubits in the quantum register

"Our current work shows that the defect centres in diamonds are significantly more versatile than we originally thought," says Jörg Wrachtrup. "We have obtained the new findings primarily through a better understanding of the defects and not by investing much into the material." The researchers will rely on smart ideas in the future as well, as they try to further improve the prospects of diamonds in the competition for the most useful quantum register. First they want to increase the number of qubits in their register. To this end, they want to integrate nuclear spins that communicate with the electron less readily than the three spins of their current computing register. They could also expand the quantum register if they succeed in entangling several NV centres and addressing the relevant nuclear spins in the vicinity of the individual centres. They would thus also have networked the nuclear spins that are controlled by the individual defects. The quantum register would then slowly be approaching a size where it could actually challenge conventional processors for some computing tasks.

Source: Max Planck Institute
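The CNOT-based error correction described in the release can be mimicked with a classical toy model. The sketch below is only an analogy of my own: it reduces the parity checks to XORs on ordinary bits, whereas the real register protects quantum superpositions and extracts the error syndrome without measuring the data qubits directly.

```python
# Classical toy version of three-qubit error correction: two parity checks
# reveal which bit deviates from the encoded pattern, and a conditional
# flip repairs it.

def encode(bit):
    """Encode a logical 0 or 1 as three identical physical bits."""
    return [bit, bit, bit]

def correct(bits):
    """Locate and flip the one bit that deviates from the pattern."""
    s1 = bits[0] ^ bits[1]        # parity check on bits 0 and 1
    s2 = bits[1] ^ bits[2]        # parity check on bits 1 and 2
    if s1 and not s2:
        bits[0] ^= 1              # first bit was flipped
    elif s1 and s2:
        bits[1] ^= 1              # middle bit was flipped
    elif s2:
        bits[2] ^= 1              # last bit was flipped
    return bits

word = encode(1)
word[2] ^= 1                      # a stray "disturbance" flips one bit
print(correct(word))              # -> [1, 1, 1]: the error is repaired
```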
<urn:uuid:e16ea48c-b311-42ad-a839-f32b4539efc8>
CC-MAIN-2014-15
http://www.rdmag.com/news/2014/02/quantum-computation-diamond
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223211700.16/warc/CC-MAIN-20140423032011-00388-ip-10-147-4-33.ec2.internal.warc.gz
en
0.936596
1,706
3.53125
4
Wires made of individual carbon atoms could be used to reduce the size of today's microchips several-fold. Carbon nanotubes (CNTs) have been researched over the past few years and used in initial experimental applications. Nano-engineering now has the task of developing production technologies that make CNT applications commonplace, even for the mass market.

Sumio Iijima presented the properties of a novel ordered structure of carbon atoms in a paper in Nature in 1991. With their three-dimensional structure and hexagonal arrangement of carbon atoms, the carbon nanotubes (CNTs) that he described resemble rolled-up chicken wire. Physicists throughout the world were enthusiastic about the material's promising properties: it was reputed to be stronger than steel, to have a thermal conductivity better than diamond, and to have an electrical conductivity 1000 times higher than copper. Depending on their chirality (the way the hexagonal lattice is rolled up), CNTs are said to be usable as semiconductors or as conductors. Via theoretical deductions and individual experiments, fundamental research in recent years has shown that CNTs are usable as semiconductors to construct transistors – the basic elements of every computer. CNTs could make computer microchips many times smaller. This is particularly attractive because the limits of miniaturisation with conventional chip materials will soon be reached and the industry urgently needs alternative technologies for further innovations. Dimos Poulikakos, Professor at the Laboratory of Thermodynamics in Emerging Technologies (LTNT), says: "Although we now understand the properties of CNT relatively well, we are still at the very beginning when it is a question of how to build such systems, which are invisible to the naked eye and extremely vulnerable to external disturbances. That's why the work we are doing in our laboratory is really basic research in the engineering sciences."

An electrifying, self-organising system

Timo Schwamb, Professor Poulikakos' doctoral student at the LTNT, published a paper in Nano Letters last November describing possible new ways of positioning CNTs in nano-electromechanical systems (NEMS). Dielectrophoresis, a technique well known from electrical engineering, was used by Schwamb for the first time with a high success rate for NEMS with more than two electrodes. This enormously expands the application spectrum of CNTs in nanotechnology. Schwamb applies an inhomogeneous electrical field to a microchip previously treated with a droplet of CNT solution. In this process the electrical circuit has a gap at exactly the place where the CNT is to be positioned. The strongest dielectrophoretic forces occur at exactly that point, and they ultimately attract and align the CNT. Schwamb can also use the method described in the paper to incorporate a four-point measuring technique into the NEMS. This eliminates distortions of the resistance measurement caused by the soldered joints of the CNT. However, because the electrical fields of four electrodes lying in one plane would get in the way and disturb the positioning of the CNT, the doctoral student decided on a three-dimensional experimental design. Schwamb explains: "Although we use a little trick to introduce four electrodes for the four-point measuring system, we nevertheless generate an electrical field equivalent to that of only two electrodes and we use this to position the CNT."
This trick works as follows: a gap 0.5 nanometres wide is milled in a three-layer microchip (conductor/insulator/conductor) in such a way that the lowermost layer projects minimally into the empty space on both sides. The two electrodes needed for the four-point measurement can now be "hidden", as it were, in the third dimension under the other two electrodes, so they no longer interfere with the dielectrophoresis. The LTNT is supported in this difficult work on the chip, at the scale of a few nanometres, by EMPA (Swiss Federal Laboratories for Materials Testing and Research). EMPA's experts can use an ion beam to mill gaps and steps in the three-layer silicon chip and, in a subsequent step, can solder tiny contact points between the carrier chip and the CNT using electron beams. Schwamb summarizes: "For the nanotechnology application we can use the new method to combine two known technologies, dielectrophoresis and the four-point measuring technique, in such a way as to create for the first time the potential to mass-produce nano-devices." This could increase the yield of successfully positioned CNTs across four contact points from approximately 3 percent to 40 percent.

Nano-engineering: an engineering tradition alive in Switzerland

According to Schwamb, the next step will involve using the new approach to build prototype devices such as transistors and temperature or pressure sensors and to test their properties in a wide variety of ways. He says that integrating nano-materials into processes with mass-production capability still represents one of the biggest challenges facing nanotechnology. The most significant benefit of the approach described in the paper, however, is that it can also be used for other nano-particles with interesting properties and is not limited to CNTs. Poulikakos also sees a piece of Switzerland's future in nano-engineering: "Switzerland should not only defend its traditional leading position in the area of engineering achievements in the construction of large machinery but should also position itself as a pioneer in the nano-devices field." However, Poulikakos is still unwilling to predict when the first CNT computer with a gigantic computing performance will come onto the market, or whether it will come from Switzerland.
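For a feel of the forces at work, the standard textbook expression for the time-averaged dielectrophoretic force on a small spherical particle is sketched below. The geometry of a real nanotube differs, and all numerical values are assumptions chosen for illustration, not parameters from Schwamb's paper.

```python
import numpy as np

# Textbook time-averaged dielectrophoretic (DEP) force on a small sphere,
# a rough stand-in for the physics that pulls a nanotube toward the
# strongest field gradient.
eps0 = 8.854e-12                      # vacuum permittivity, F/m

def clausius_mossotti(eps_p, eps_m):
    """Real Clausius-Mossotti factor for the lossless case."""
    return (eps_p - eps_m) / (eps_p + 2 * eps_m)

def dep_force(radius, eps_m_rel, K, grad_E2):
    """F = 2*pi*eps_m*r^3 * Re[K] * grad(|E|^2), in newtons."""
    return 2 * np.pi * eps0 * eps_m_rel * radius**3 * K * grad_E2

K = clausius_mossotti(eps_p=1000.0, eps_m=80.0)   # highly polarizable particle in water
F = dep_force(radius=5e-9, eps_m_rel=80.0, K=K, grad_E2=1e17)
print(f"K = {K:.2f}, F = {F:.2e} N")
```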
<urn:uuid:80d179e1-a487-454f-853a-d934dc09e98c>
CC-MAIN-2014-15
http://www.sciencedaily.com/releases/2008/03/080307103832.htm
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223203235.2/warc/CC-MAIN-20140423032003-00351-ip-10-147-4-33.ec2.internal.warc.gz
en
0.941295
1,175
3.640625
4
Synchronized lasers measure how light changes matter

Berkeley Lab scientists and their colleagues have successfully probed the effects of light at the atomic scale by mixing x-ray and optical light waves at the Linac Coherent Light Source.

Light changes matter in ways that shape our world. Photons trigger changes in proteins in the eye to enable vision; sunlight splits water into hydrogen and oxygen and creates chemicals through photosynthesis; light causes electrons to flow in the semiconductors that make up solar cells; and new devices for consumers, industry, and medicine operate with photons instead of electrons. But directly measuring how light manipulates matter on the atomic scale has never been possible, until now.

An international team of scientists led by Thornton Glover of the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) used the Linac Coherent Light Source (LCLS) at the SLAC National Accelerator Laboratory to mix a pulse of superbright x-rays with a pulse of lower frequency, "optical" light from an ordinary laser. By aiming the combined pulses at a diamond sample, the team was able to measure the optical manipulation of chemical bonds in the crystal directly, on the scale of individual atoms. The researchers report their work in the August 30, 2012 issue of the journal Nature.

Mixing x-rays with light in x-ray diffraction

X-ray and optical wave-mixing is an x-ray diffraction technique similar to that long used in solving the structures of proteins and other biological molecules in crystalline form. But in contrast to conventional diffraction, wave mixing selectively probes how light reshapes the distribution of charge in a material. It does this by imposing a distinction between x-rays scattered from optically perturbed charge and x-rays scattered from unperturbed charge.

"You can think of the electrons orbiting atoms in a material as belonging to one of two groups," says Glover. "The 'active' electrons are the outer, loosely bound valence electrons that participate in chemical reactions and form chemical bonds. The 'spectator' electrons are the ones tightly wrapped around the nucleus at the atom's core."

Glover explains that "because the x-ray photon energy is large compared to the electron binding energy, in a typical scattering experiment all electrons scatter with comparable strength and are therefore more or less indistinguishable." The core-electron signal usually swamps the weaker valence-charge signal because there are many more core electrons than valence electrons.

"So x-rays can tell you where atoms are, but they usually can't reveal how the chemically important valence charge is distributed," Glover says. "However, when light is also present with the x-rays, it wiggles some portion of the chemically relevant valence charge. X-rays scatter from this optically driven charge, and in doing so the x-ray photon energy is changed." The modified x-rays have a frequency (or energy) equal to the sum of the frequencies of both the original x-ray pulse and the overlapping optical pulse. The change to a slightly higher energy provides a distinct signature, which distinguishes wave mixing from conventional x-ray diffraction.

"Conventional diffraction does not provide direct information on how the valence electrons respond to light, nor on the electric fields that arise in a material because of this response," says Glover.
"But with x-ray and optical wave mixing, the energy-modified x-rays selectively probe a material's optically responsive valence charge." Beyond the ability to directly probe atomic-scale details of how light initiates such changes as chemical reactions or phase transitions, sensitivity to valence charge creates new opportunities to track the evolution of chemical bonds or conduction electrons in a material – something traditional x-ray diffraction does poorly. Different components of the valence charge can be probed by tuning the so-called optical pulse; higher-frequency pulses of extreme ultraviolet light, for example, probe a larger portion of valence charge. Because mixing x-ray and optical light waves creates a new beam, which shows up as a slightly higher-energy peak on a graph of x-ray diffraction, the process is called "sum frequency generation." It was proposed almost half a century ago by Isaac Freund and Barry Levine of Bell Labs as a technique for probing the microscopic details of light's interactions with matter, by separating information about the position of atoms from the response of valence charge exposed to light. But sum frequency generation requires intense x-ray sources unavailable until recently. SLAC's LCLS is just such a source. It's a free-electron laser (FEL) that can produce ultrashort pulses of high-energy "hard" x-rays millions of times brighter than synchrotron light sources, a hundred times a second. "The breadth of the science impact of LCLS is still before us," says Jerome Hastings, a professor of photon science at the LCLS and an author of the Nature article. "What is clear is that it has the potential to extend nonlinear optics into the x-ray range as a useful tool. Wave mixing is an obvious choice, and this first experiment opens the door." Diamonds are just the beginning Glover's team chose diamond to demonstrate x-ray and optical wave mixing because diamond's structure and electronic properties are already well known. With this test bed, wave mixing has proved its ability to study light-matter interactions on the atomic scale and has opened new opportunities for research. "The easiest kinds of diffraction experiments are with crystals, and there's lots to learn," Glover says. "For example, light can be used to alter the magnetic order in advanced materials, yet it's often unclear just what the light does, on the microscopic scale, to initiate these changes." Looking farther ahead, Glover imagines experiments that observe the dynamic evolution of a complex system as it evolves from the moment of initial excitation by light. Photosynthesis is a prime example, in which the energy of sunlight is transferred through a network of light-harvesting proteins into chemical reaction centers with almost no loss. "Berkeley Lab's Graham Fleming has shown that this virtually instantaneous energy transfer is intrinsically quantum mechanical," Glover says. "Quantum entanglement plays an important role, as an excited electron simultaneously samples many spatially separated sites, probing to find the most efficient energy-transfer pathway. It would be great if we could use x-ray and optical wave mixing to make real-space images of this process as it's happening, to learn more about the quantum aspects of the energy transfer." Such experiments will require high pulse-repetition rates that free electron lasers have not yet achieved. 
Synchrotron light sources like Berkeley Lab's Advanced Light Source, although not as bright as FELs, have inherently high repetition rates and, says Glover, "may play a role in helping us assess the technical adjustments needed for high repetition-rate experiments." Light sources with repetition rates up to a million pulses per second may someday be able to do the job. Glover says, "FELs of the future will combine high-peak brightness with high repetition rate, and this combination will open new opportunities for examining the interactions of light and matter on the atomic scale." "X-ray and optical wave mixing," by T.E. Glover, D.M. Fritz, M. Cammarata, T.K. Allison, Sinisa Coh, J.M. Feldkamp, H. Lemke, D. Zhu, Y. Feng, R.N. Coffee, M. Fuchs, S. Ghimire, J. Chen, S. Shwartz, D.A. Reis, S.E. Harris, and J. B. Hastings, appears in the August 30, 2012 issue of Nature. The work was principally supported by the U.S. Department of Energy's Office of Science. Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy's Office of Science. For more, visit http://www.lbl.gov. The Linac Coherent Light Source (LCLS), a division of SLAC National Accelerator Laboratory and a National User Facility, is operated by Stanford University for the US Department of Energy, Office of Science. DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit the Office of Science website at http://science.energy.gov/. Original release: http://www.eurekalert.org/pub_releases/2012-08/dbnl-slm082712.php
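A back-of-the-envelope calculation shows why the sum-frequency signature discussed in this article is subtle. The photon energies below are assumed example values (an 8 keV x-ray and an 800 nm optical photon), not the actual experimental parameters reported in the Nature paper.

```python
# Illustrative arithmetic for sum frequency generation: the outgoing photon
# carries the x-ray energy plus the optical photon energy.
h_eV_s = 4.135667e-15          # Planck constant in eV*s

E_xray_eV = 8000.0             # assumed ~8 keV hard x-ray photon
E_optical_eV = 1.55            # assumed 800 nm optical photon

E_sum = E_xray_eV + E_optical_eV
print(f"sum-frequency photon energy: {E_sum:.2f} eV")
print(f"relative energy shift: {E_optical_eV / E_xray_eV:.2e}")   # roughly 2e-4

# Equivalent statement in frequency: f_sum = f_xray + f_optical
f_xray = E_xray_eV / h_eV_s
f_opt = E_optical_eV / h_eV_s
print(f"f_sum = {f_xray + f_opt:.3e} Hz")
```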
<urn:uuid:efbd2e45-7ca9-4242-aee4-452efbcecaf5>
CC-MAIN-2014-15
http://www.ecnmag.com/news/2012/08/synchronized-lasers-measure-how-light-changes-matter?qt-most_popular=0&qt-video_of_the_day=0
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609538110.1/warc/CC-MAIN-20140416005218-00640-ip-10-147-4-33.ec2.internal.warc.gz
en
0.929801
1,858
3.625
4
In 2007, a supergiant star two hundred times bigger than the sun was utterly obliterated by runaway thermonuclear reactions triggered by gamma ray-driven antimatter production. The resulting blast was visible for months because it unleashed a cloud of radioactive material over fifty times the size of our own star, giving off a nuclear fission glow visible from galaxies away. SN 2007bi was discovered by the international Nearby Supernova Factory based at the U.S. Department of Energy's Lawrence Berkeley National Laboratory. The explosion ejected more than 22 solar masses of silicon and other heavy elements into space, including more than six solar masses of radioactive nickel which caused the expanding gases to glow brightly for many months. Continue reading ""The Antimatter Supernova" --One of the Largest Cosmic Explosions Ever Recorded" » It's no accident that we see stars in the sky, says famed Oxford biologist Richard Dawkins: they are a vital part of any universe capable of generating us. But, as Dawkins emphasizes, that does not mean that stars exists in order to make us."It is just that without stars there would be no atoms heavier than lithium in the periodic table," Dawkins wrote in The Ancestors Tale -- A Pilgrimage to the Dawn of Evolution, "and a chemistry of only three elements is too impoverished to support life. Seeing is the kind of activity that can go on only in the kind of universe where what you see is stars." Continue reading ""Alien Edens" ----Evolutionary Biologist Richard Dawkins: 'Life Exists Elsewhere in the Universe' " » It came suddenly from the distant reaches of the Constellation Sagittarius, some 50,000 light years away. For a brief instant, a couple of tenths of a second, on December 27, 2004, an invisible burst of energy the equivalent of half a million years of sunlight shone on Earth. Many orbiting satellites electronics were zapped and the Earth's upper atmosphere was amazingly ionized from a massive hit of gamma ray energy. Continue reading "December 27, 2004 --The Day Planet Earth Survived Its Greatest Space-Ray Attack" » "The answer may be that other regions of the Universe are not quite so favorable for life as we know it, and that the laws of physics we measure in our part of the Universe are merely ‘local by-laws', in which case it is no particular surprise to find life here," says John Webb of the University of New South Wales . Continue reading ""Some Regions of the Universe are Not Favorable for Life" " » "Of all the many glorious images we have received from Saturn, none are more strikingly unusual than those taken from Saturn's shadow," said Carolyn Porco, Cassini's imaging team lead based at the Space Science Institute in Boulder, Colo. Continue reading "Image of the Day: Inside the Shadow of Saturn" » In a major paper in Science, Perimeter Faculty member Xiao-Gang Wen reveals a modern reclassification of all 500 phases of matter. Using modern mathematics, Wen and collaborators reveal a new system which can, at last, successfully classify symmetry-protected phases of matter. Their new classification system will provide insight about these quantum phases of matter, which may in turn increase our ability to design states of matter for use in superconductors or quantum computers. The paper provides a revealing look at the intricate and fascinating world of quantum entanglement, and an important step toward a modern reclassification of all phases of matter. 
Continue reading "Quantum Entanglement is Creating New Classifications of All Phases of Matter" » Astronomers have come to realize that the process of star formation, once thought to consist essentially of just the simple coalescence of material by gravity, occurs in a complex series of stages. As the gas and dust in giant molecular clouds comes together into stars, dramatic outflowing jets of material develop around each, as do circumstellar disks (possibly pre-planetary in nature). Other features are present as well: Astronomers in the 1960s were amazed to discover that these star-forming regions sometimes produce natural masers (masers are the bright, radio wavelength analogs of lasers). Clouds of water vapor or methanol vapor in regions of active star formation generate some of the most spectacular masers. Continue reading "Holiday Weekend Image: A Spectacular Masar" » The Fermi paradox is the apparent contradiction between high estimates of the probability of the existence of extraterrestrial civilizations and the lack of evidence for, or contact with, such civilizations. As Enrico Fermi asked if the Universe is conducive to intelligent life, “Where is everybody?” Continue reading "Advanced ET Civilizations May Be Impossible to Detect (Holiday Weekend Feature)" » "Even if there is only 1 intelligent civilization per galaxy where you have over 100 billion stars per galaxy with some galaxies sporting nearer to a trillion stars. There are over 100 billion galaxies in the visible universe, maybe more. So even assuming you have only 1 species per galaxy, that's still 100 billion x 100 billion possible life sustaining solar systems. Which is probably a small estimate. We know that the building blocks of life are present more or less everywhere in the universe." View Original Post Continue reading "Comment of the Day: Advanced ET Civilizations May Be Impossible to Detect (Holiday Weekend Feature)" »
<urn:uuid:b43b1f74-af43-45d9-ab1f-3005ed691196>
CC-MAIN-2014-15
http://www.dailygalaxy.com/my_weblog/2012/12/page/2/
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609526102.3/warc/CC-MAIN-20140416005206-00560-ip-10-147-4-33.ec2.internal.warc.gz
en
0.917455
1,100
3.5
4
One of the responsibilities we have as researchers is to have the courage to challenge accepted "truths" and to seek out new insights. Richard Feynman was a physicist who not only epitomized both of these qualities in his research but also took enormous pleasure in communicating the ideas of physics to students. Feynman won the Nobel Prize for the computational toolkit that we now call Feynman Diagrams. The techniques he developed helped the physics community make sense of Quantum Electrodynamics (QED) after the war, when the entire community was in a state of confusion about how to handle the infinities that appeared all over the place when one tried to make a perturbative expansion in the coupling.

Feynman was the subject of a recent TEDxCaltech conference, fittingly called "Feynman's Vision: The Next 50 Years." The event was organized in recognition of the 50-year anniversary of Feynman's visionary talk, "There's Plenty of Room at the Bottom," in which he set out a vision for nanoscience that is only now beginning to be realized. It is also 50 years since he gave his revolutionary "Feynman Lectures on Physics," which educated generations of physicists. I had the honor of speaking about Feynman's contributions to computing, from his days at Los Alamos during the war, his Nobel Prize winning computational toolkit (Feynman Diagrams), and his invention of quantum computing. By striving to think differently, he truly changed the world. The following are some highlights from my presentation.

Parallel Computing Without Computers

Feynman worked on the Manhattan Project at Los Alamos in the 1940s with Robert Oppenheimer, Hans Bethe, and Edward Teller. In order to make an atom bomb from the newly discovered trans-uranic element plutonium, it was necessary to generate a spherical compression wave to compress the plutonium to critical mass for the chain reaction to start. It was, therefore, necessary to calculate how to position explosive charges in a cavity to generate such a compression wave; these calculations were sufficiently complex that they had to be done numerically. The team assigned to perform these calculations was known as the "IBM team," but it should be stressed that this was in the days before computers and the team operated on decks of cards with adding machines, tabulators, sorters, collators, and so on.

The problem was that the calculations were taking too long, so Feynman was put in charge of the IBM team. Feynman immediately discovered that, because of the obsession with secrecy at Los Alamos, the team members had no idea of the significance of their calculations or why they were important for the war effort. He went straight to Oppenheimer and asked for permission to brief the team about the importance of their implosion calculations. He also discovered a way to speed up the calculations. By assigning each problem to a different colored deck of cards, the team could work on more than one problem at once. While one deck was using one of the machines for one stage of the calculation, another deck could be using a different machine for a different stage of its calculation. In essence, this is a now-familiar technique of parallel computing—the pipeline parallelism familiar from the Cray vector supercomputers, for example. The result was a total transformation. Instead of completing only three problems in nine months, the team was able to complete nine problems in three months!
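A toy schedule makes the pipeline idea concrete. The stage names, timings, and three-deck example below are invented for illustration; they are not a reconstruction of the actual Los Alamos workflow.

```python
# Three "decks" (problems) flowing through three single-purpose machines.
from collections import defaultdict

stages = ["punch", "tabulate", "collate"]     # one machine per stage
decks = ["red", "blue", "green"]              # one colored deck per problem
stage_time = 1                                # each stage takes 1 time unit

busy_until = defaultdict(int)                 # when each machine frees up
finish = {}
for deck in decks:
    t = 0
    for stage in stages:
        start = max(t, busy_until[stage])     # wait until the machine is free
        t = start + stage_time
        busy_until[stage] = t
    finish[deck] = t

print(finish)                                 # {'red': 3, 'blue': 4, 'green': 5}
print("pipelined makespan:", max(finish.values()))                       # 5 units
print("one-at-a-time makespan:", len(decks) * len(stages) * stage_time)  # 9 units
```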
Of course, this led to a different problem when management reasoned that it should be possible to complete the last calculation needed for the Trinity test in less than a month. To meet this deadline, Feynman and his team had to address the more difficult problem of breaking up a single calculation into pieces that could be performed in parallel.

My next story starts in 1948 at the Pocono Conference, where all the great figures of physics—Niels Bohr, Paul Dirac, Robert Oppenheimer, Edward Teller, and so on—had assembled to try to understand how to make sense of the infinities in QED. Feynman and Schwinger were the star speakers, but Feynman was unable to make his audience understand how he did his calculations. His interpretation of positrons as negative energy electrons moving backwards in time was just too hard for them to accept. After the conference, Feynman was in despair and later said, "My machines came from too far away."

Less than a year later, Feynman had his triumph. At an American Physical Society meeting in New York, Murray Slotnick talked about some calculations he had done with two different meson-nucleon couplings. He had shown that these two couplings indeed gave different answers. After Slotnick's talk, Oppenheimer got up from the audience and said that Slotnick's calculations must be wrong since they violated Case's Theorem. Poor Slotnick had to confess that he had never heard of Case's Theorem, and Oppenheimer informed him that he could remedy his ignorance by listening to Professor Case present his theorem the following day.

That night, Feynman couldn't sleep, so he decided to re-do Slotnick's calculations using his diagram techniques. The next day at the conference, Feynman sought out Slotnick, told him what he had done, and suggested they compare results. "What do you mean you worked it out last night?" Slotnick responded. "It took me six months!" As the two compared answers, Slotnick asked, "What is that Q in there, that variable Q?" Feynman replied that the Q was the momentum transfer as the electron was deflected by different angles. "Oh," Slotnick replied. "I only have the limiting value as Q approaches zero. For forward scattering." Feynman said, "No problem, we can just set Q equal to zero in my formulas!" Feynman found that he had obtained the same answer as Slotnick. After Case had presented his theorem, Feynman stood up at the back of the audience and said, "Professor Case, I checked Slotnick's calculations last night and I agree with him, so your theorem must be wrong." And then he sat down.

That was a thrilling moment for Feynman, like winning the Nobel Prize—which he did much later—because he was now sure that he had achieved something significant. It had taken Slotnick six months to do the case of zero momentum transfer, while Feynman had been able to complete the calculation for arbitrary momentum transfer in one evening. The computational toolkit that we now call Feynman Diagrams has penetrated almost all areas of physics, and his diagrams appear on the blackboards of physicists all around the world. This toolkit is undoubtedly Feynman's greatest gift to physics, and the story perfectly illustrates Feynman's preference for concrete, detailed calculation rather than reliance on more abstract theorems.

The Physics of Computation

At the invitation of his friend Ed Fredkin, Feynman delivered a keynote lecture at "The Physics of Computation" Conference at MIT in 1981.
Feynman considered the problem of whether it was possible to perform an accurate simulation of Nature on a classical computer. As Nature ultimately obeys the laws of quantum mechanics, the problem reduces to simulating a quantum mechanical system on a classical computer. Because of the nature of quantum objects like electrons, truly quantum mechanical calculations on a classical computer rapidly become impractical for more than a few tens of electrons. Feynman then proceeded to consider a new type of computer based on quantum mechanics: a quantum computer. He realized that this was a new type of computer: "Not a Turing machine, but a machine of a different kind." Interestingly, Feynman did not go on to explore the different capabilities of quantum computers but simply demonstrated how you could use them to simulate true quantum systems. By his presence at the conference, Feynman stimulated interest both in the physics of computation and in quantum computing. At this conference 30 years later, we heard several talks summarizing progress towards actually building a quantum computer.

In the last five years of his life, Feynman gave lectures on computation at Caltech, initially with colleagues Carver Mead and John Hopfield, and for the last three years by himself. I was fortunate enough to be asked by Feynman to write up his "Lectures on Computation." The lectures were a veritable tour de force and were probably a decade ahead of their time. Feynman considered the limits to computation due to mathematics, thermodynamics, noise, silicon engineering, and quantum mechanics. In the lectures, he also gave his view about the field of computer science: he regarded science as the study of natural systems and classified computer science as engineering, since it studied man-made systems.

Inspiring Later Generations

Feynman said that he started out very focused on physics and only broadened his studies later in life. There are several fascinating biographies of Feynman, but the one I like best is No Ordinary Genius by Christopher Sykes. This is a wonderful collection of anecdotes, interviews, and articles about Feynman and his wide range of interests—from physics, to painting, to bongo drums and the Challenger Enquiry. Feynman was a wonderful inspiration to the entire scientific community, and his enjoyment of and enthusiasm for physics is beautifully captured in the TV interview, "The Pleasure of Finding Things Out," produced by Christopher Sykes for the BBC. Feynman is forever a reminder that we must try to think differently in order to innovate and succeed.

—Tony Hey, corporate vice president of the External Research Division of Microsoft Research
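The rough arithmetic behind Feynman's point about simulating quantum systems on classical machines can be sketched as follows; the numbers are illustrative only, but the exponential growth of the state vector is the essential obstacle.

```python
# Storing the full quantum state of n two-level systems takes 2**n complex
# amplitudes, so the memory requirement explodes with n.
BYTES_PER_AMPLITUDE = 16                 # one double-precision complex number

for n in (20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * BYTES_PER_AMPLITUDE / 2 ** 30
    print(f"{n:2d} qubits: {amplitudes:.2e} amplitudes, about {gib:,.2f} GiB")
```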
<urn:uuid:4a20df47-7d0e-4e38-94e3-b8d9585c88af>
CC-MAIN-2014-15
http://blogs.msdn.com/b/msr_er/archive/2011/02/04/celebrating-richard-feynman-at-tedxcaltech.aspx?Redirected=true
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223206118.10/warc/CC-MAIN-20140423032006-00368-ip-10-147-4-33.ec2.internal.warc.gz
en
0.970837
1,992
3.640625
4
The Josephson effect is the phenomenon of supercurrent—i.e. a current that flows indefinitely long without any voltage applied—across a device known as a Josephson junction (JJ), which consists of two superconductors coupled by a weak link. The weak link can consist of a thin insulating barrier (known as a superconductor–insulator–superconductor junction, or S-I-S), a short section of non-superconducting metal (S-N-S), or a physical constriction that weakens the superconductivity at the point of contact (S-s-S).

The Josephson effect is an example of a macroscopic quantum phenomenon. It is named after the British physicist Brian David Josephson, who predicted in 1962 the mathematical relationships for the current and voltage across the weak link. The DC Josephson effect had been seen in experiments prior to 1962, but had been attributed to "super-shorts" or breaches in the insulating barrier leading to the direct conduction of electrons between the superconductors. The first paper to claim the discovery of Josephson's effect, and to make the requisite experimental checks, was that of Philip Anderson and John Rowell. These authors were awarded patents on the effects, which were never enforced but never challenged.

Before Josephson's prediction, it was only known that normal (i.e. non-superconducting) electrons can flow through an insulating barrier, by means of quantum tunneling. Josephson was the first to predict the tunneling of superconducting Cooper pairs. For this work, Josephson received the Nobel prize in physics in 1973. Josephson junctions have important applications in quantum-mechanical circuits, such as SQUIDs, superconducting qubits, and RSFQ digital electronics.

The basic equations governing the dynamics of the Josephson effect are

$\frac{\partial \varphi}{\partial t} = \frac{2 e U(t)}{\hbar}$  (superconducting phase evolution equation)

$I(t) = I_c \sin(\varphi(t))$  (Josephson or weak-link current-phase relation)

where U(t) and I(t) are the voltage and current across the Josephson junction, $\varphi$ is the "phase difference" across the junction (i.e., the difference in phase factor, or equivalently, argument, between the Ginzburg–Landau complex order parameter of the two superconductors composing the junction), and $I_c$ is a constant, the critical current of the junction. The critical current is an important phenomenological parameter of the device that can be affected by temperature as well as by an applied magnetic field. The physical constant $\Phi_0 = h/2e$ is the magnetic flux quantum, the inverse of which is the Josephson constant $K_J = 2e/h$.

The three main effects predicted by Josephson follow from these relations:

- The DC Josephson effect - This refers to the phenomenon of a direct current crossing the insulator in the absence of any external electromagnetic field, owing to tunneling. This DC Josephson current is proportional to the sine of the phase difference across the insulator, and may take values between $-I_c$ and $I_c$.

- The AC Josephson effect - With a fixed voltage $U_{DC}$ across the junction, the phase will vary linearly with time and the current will be an AC current with amplitude $I_c$ and frequency $K_J U_{DC} = 2eU_{DC}/h$. The complete expression for the current drive becomes $I(t) = I_c \sin(\varphi_0 + 2\pi K_J U_{DC}\, t)$. This means a Josephson junction can act as a perfect voltage-to-frequency converter.
- The inverse AC Josephson effect - If the phase takes the form $\varphi(t) = \varphi_0 + n\omega t + a \sin(\omega t)$, the voltage and current will be $U(t) = \frac{\hbar}{2e}\,\omega\,(n + a\cos\omega t)$ and $I(t) = I_c \sum_{m} J_m(a)\,\sin\!\left(\varphi_0 + (n+m)\,\omega t\right)$. The DC components will then be $U_{DC} = \frac{\hbar}{2e}\, n\omega$ and $I_{DC} = I_c\, J_{-n}(a)\,\sin\varphi_0$. Hence, for distinct AC voltages, the junction may carry a DC current and the junction acts like a perfect frequency-to-voltage converter.

The Josephson effect has found wide usage, for example in the following areas:

- SQUIDs, or superconducting quantum interference devices, are very sensitive magnetometers that operate via the Josephson effect. They are widely used in science and engineering.
- In precision metrology, the Josephson effect provides an exactly reproducible conversion between frequency and voltage. Since the frequency is already defined precisely and practically by the caesium standard, the Josephson effect is used, for most practical purposes, to give the definition of a volt (although, as of July 2007, this is not the official BIPM definition).
- Single-electron transistors are often constructed of superconducting materials, allowing use to be made of the Josephson effect to achieve novel effects. The resulting device is called a "superconducting single-electron transistor." The Josephson effect is also used for the most precise measurements of elementary charge in terms of the Josephson constant and the von Klitzing constant, which is related to the quantum Hall effect.
- RSFQ digital electronics is based on shunted Josephson junctions. In this case, the junction switching event is associated with the emission of one magnetic flux quantum that carries the digital information: the absence of switching is equivalent to 0, while one switching event carries a 1.
- Josephson junctions are integral in superconducting quantum computing as qubits, such as in a flux qubit or other schemes where the phase and charge act as the conjugate variables.
- Superconducting tunnel junction detectors (STJs) may become a viable replacement for CCDs (charge-coupled devices) for use in astronomy and astrophysics in a few years. These devices are effective across a wide spectrum from ultraviolet to infrared, and also in x-rays. The technology has been tried out on the William Herschel Telescope in the SCAM instrument.
- Quiterons and similar superconducting switching devices.
- The Josephson effect has also been observed in SHeQUIDs, the superfluid helium analog of a dc-SQUID.

Related topics:
- Andreev reflection
- Fractional vortices
- Ginzburg–Landau theory
- Macroscopic quantum phenomena
- Macroscopic quantum self-trapping
- Pi Josephson junction
- Varphi Josephson junction
- Quantum computer
- Quantum gyroscope
- Rapid single flux quantum (RSFQ)
- Superconducting tunnel junction
- Zero-point energy

References:
- Josephson, B. D., "Possible new effects in superconductive tunnelling," Physics Letters 1, 251 (1962). doi:10.1016/0031-9163(62)91369-0
- Josephson, B. D. (1974). "The discovery of tunnelling supercurrents". Rev. Mod. Phys. 46 (2): 251–254. Bibcode:1974RvMP...46..251J. doi:10.1103/RevModPhys.46.251.
- Josephson, Brian D. (December 12, 1973). "The Discovery of Tunneling Supercurrents (Nobel Lecture)".
- Anderson, P. W.; Rowell, J. M. (1963). "Probable Observation of the Josephson Tunnel Effect". Phys. Rev. Letters 10: 230. Bibcode:1963PhRvL..10..230A. doi:10.1103/PhysRevLett.10.230.
- The Nobel prize in physics 1973, accessed 2011-08-18.
- Anderson, P. W.; Dayem, A. H., "Radio-frequency effects in superconducting thin film bridges," Physical Review Letters 13, 195 (1964). doi:10.1103/PhysRevLett.13.195
- Dawe, Richard (28 October 1998). "SQUIDs: A Technical Report - Part 3: SQUIDs" (website). http://rich.phekda.org. Retrieved 2011-04-21.
- Barone, A.; Paterno, G. (1982). Physics and Applications of the Josephson Effect. New York: John Wiley & Sons. ISBN 0-471-01469-9.
- International Bureau of Weights and Measures (BIPM), SI brochure, section 2.1, accessed 2012-04-17.
- Fulton, T. A.; et al. (1989). "Observation of Combined Josephson and Charging Effects in Small Tunnel Junction Circuits". Physical Review Letters 63 (12): 1307–1310. Bibcode:1989PhRvL..63.1307F. doi:10.1103/PhysRevLett.63.1307. PMID 10040529.
- Bouchiat, V.; Vion, D.; Joyez, P.; Esteve, D.; Devoret, M. H. (1998). "Quantum coherence with a single Cooper pair". Physica Scripta T 76: 165. Bibcode:1998PhST...76..165B. doi:10.1238/Physica.Topical.076a00165.
- Physics Today, "Superfluid helium interferometers," Y. Sato and R. Packard, October 2012, page 31.
<urn:uuid:33de6a4b-ee88-4d1e-badf-8d0b9e49eacb>
CC-MAIN-2014-15
http://en.wikipedia.org/wiki/Josephson_junction
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609536300.49/warc/CC-MAIN-20140416005216-00281-ip-10-147-4-33.ec2.internal.warc.gz
en
0.874214
1,892
3.953125
4
Technology Research News

Give an electron two paths to get to one location and it will usually take both. This fact of quantum physics plays a leading role in a computer architecture that could replace today's chip technology when it reaches its limits in a decade or so. According to the laws of quantum physics, electrons are waves as well as particles. Like ocean waves, where two crests meet they reinforce each other, and where a crest and a trough meet they cancel each other out. Researchers at the University of Missouri at Rolla have devised a scheme for using electron wave interference to represent the ones and zeros of digital information.

Traditional electronic computers use combinations of transistors, which are tiny electronic switches, as the logic units that perform the binary arithmetic at the heart of digital computing. Electron wave computers would instead use networks of microscopic wire rings that form the two paths for the electron waves to follow, said Cheng-Hsiao Wu, a professor of electrical and computer engineering at the University of Missouri at Rolla. "You do not need transistors to control the flow of charge if all the devices involved are very small and at low temperature," said Wu.

The researchers' proposal involves using modified forms of Aharonov-Bohm rings, which are used in basic physics research, to form the logic gates of computers. Aharonov-Bohm rings are circles of extremely thin wire and are commonly made several times smaller than a red blood cell. Due to their wave nature, electrons entering the Aharonov-Bohm rings travel in both directions at once, meeting -- and reinforcing each other -- at the output terminal. Using a magnetic field perpendicular to the ring, researchers can speed up or slow down the electron wave traveling in one side of the ring, throwing the waves in the two sides out of sync and causing the waves to cancel each other out when they meet at the other end. The reinforced waves and the canceled waves could represent the ones and zeros of computing, according to Wu.

Aharonov-Bohm rings have an input and an output terminal. The researchers' scheme calls for making three- and four-terminal Aharonov-Bohm rings. Their work shows that three-terminal rings could be combined to form IF-THEN, XOR, OR, AND and INVERTER logic units. These logic units could, in turn, be combined to form half adders and full adders. A half adder adds two binary numbers but cannot carry, and a full adder includes the carry function. A single, four-terminal Aharonov-Bohm ring could also be used as a half adder, said Wu. "It replaces eight transistors for the same function." And two connected four-terminal Aharonov-Bohm rings could serve as a full adder. "This replaces about two dozen transistors in traditional microelectronic circuits," he said.

In addition to the potential for making smaller, and therefore faster, computer circuits, electron wave computers could solve certain problems faster than even the fastest ordinary computer by examining all of the possible solutions to a problem at once, according to Wu. Electron wave interference could be used to make massively parallel processing computers, he said. "Millions of inputs enter a large network [of rings] simultaneously with desirable outputs when the waves arrive at the output terminals. This is similar to optical computing." Optical computers use light waves that reinforce and cancel each other out. Last year, researchers at the University of Rochester demonstrated an optical computer running a quantum search algorithm.
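A minimal two-path interference model shows how a ring could encode ones and zeros, as described above. This is a generic scalar-amplitude sketch of my own, not the Missouri-Rolla three- or four-terminal design.

```python
import numpy as np

# Two-path interference in an Aharonov-Bohm ring: the flux-dependent phase
# 2*pi*Phi/Phi_0 between the arms decides whether the output is a "1"
# (constructive) or a "0" (destructive).
def output_intensity(flux_ratio):
    """flux_ratio = Phi / Phi_0, the enclosed flux in units of the flux quantum."""
    upper = np.exp(1j * 0.0)                       # phase picked up on one arm
    lower = np.exp(1j * 2 * np.pi * flux_ratio)    # extra AB phase on the other arm
    return abs(upper + lower) ** 2 / 4             # normalised output, 0..1

for phi in (0.0, 0.25, 0.5):
    print(f"Phi/Phi_0 = {phi:4.2f}  ->  output = {output_intensity(phi):.2f}")
# 0.00 -> 1.00 (waves reinforce, a logical 1)
# 0.50 -> 0.00 (waves cancel, a logical 0)
```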
The electron wave scheme is an idea worth trying, said Ian Walmsley, a professor of experimental physics at the University of Oxford and a professor of optics at the University of Rochester. "The nice thing about electrons is that [their] wavelengths are inherently smaller than optical wavelengths, so the whole machine can be smaller. At present I see the advance as a technical one rather than a fundamental one," he added.

"It's a very neat idea but... completely theoretical," said Mike Lea, a professor of physics at the University of London. "I'd be quite skeptical about claims without at least some analysis of the likely practicalities based on real experiments," he said.

The researchers are working out the physics for larger networks of Aharonov-Bohm rings, said Wu. "I would like to convince experimentalists elsewhere to simply extend the original Aharonov-Bohm effect to three or four terminals. I promise nice results will come out of such a simple extension," he said. Given that today's semiconductor technology is likely to reach its limits by the year 2015, researchers and engineers should have a good idea of how to build devices smaller than 10 nanometers by then, said Wu. At that point, electron wave computing could be a contender for the next-generation computer architecture, he said.

Wu's research colleague was Diwakar Ramamurthy. They published the research in the February 15, 2002 issue of the journal Physical Review B. The research was funded by the university.

Timeline: 13 years
TRN Categories: Quantum Computing and Communications; Integrated
Story Type: News
Related Elements: Technical paper, "Logic Functions from Three-Terminal Quantum Resistor Networks for Electron Wave Computing," Physical Review B, February 15, 2002
<urn:uuid:6f05c02e-7e7f-4134-930a-7b1986f328d0>
CC-MAIN-2014-15
http://www.trnmag.com/Stories/2002/040302/Electron_waves_compute_040302.html
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223202774.3/warc/CC-MAIN-20140423032002-00018-ip-10-147-4-33.ec2.internal.warc.gz
en
0.905964
1,202
3.921875
4
Introduced in Alan Turing's 1936 paper On computable numbers, with an application to the Entscheidungsproblem, a universal Turing machine is a mathematical idealisation of a general purpose computer. Able to act, with appropriate input, as literally any other possible Turing machine, Turing's invention, essentially the concept of a general purpose CPU executing a stored program, was probably the largest single step taken in the development of the computer, and is often regarded as the start of computer science.

A Turing machine (TM) consists of a tape, a head which can mark and erase the tape, and a set of states. Depending on whether the tape is currently marked, and which state is occupied, the TM will erase or mark the tape or not, and move it one square left or right, at which point the next state kicks in. Additionally, there is a state which causes the TM to halt, if it is reached. The tape is considered to be of arbitrary length and composed of discrete units which are accessible to the head in strict order, singly and wholly - that is, the tape is an idealised one-bit erasable paper tape which never stretches, breaks, folds, runs out, or breaks other rules which are harder to think of. The critical thing is that though the tape may be arbitrarily large, each step of the operation of a TM is completely determined by a finite number of simple and unambiguous rules. It is completely mechanical in its operation, and always behaves in the same way for any particular state and input.

These rules defining a TM (the set of states) can be written out in a standard form as marks on a tape. The interpretation of such an on-tape representation of a TM is then a mechanical procedure which can be realised by some TM with a suitable set of states. A universal Turing machine (UTM) is a particular TM so constructed that its tape can encode any TM whatsoever, with the guarantee that the UTM will then do just what the encoded TM would do.

Suppose we have a machine M; then its output with initial tape t can be written M(t). A UTM U is then a TM such that for all outputs Mi(tj) there is some ei,j such that U(ei,j) = Mi(tj). We'd call ei,j the encoding of Mi(tj). It's also required that the UTM can recognise input that is not a valid encoding of a TM and produce a predetermined response when this occurs.

Turing proved the existence of such UTMs by specifying one in his paper - it turned out not to be very complex - and showing it had the characteristic required, of replicating the behaviour of an arbitrary TM which is encoded on its tape. This is the essence of the modern computer: given sufficient storage it can carry out an arbitrary program, encoded into some specific "language". The choice of a particular UTM defines a particular language. Turing's insight was that an algorithm, when encoded, is just so much data that can then be operated on by another algorithm. The idea of encoding a TM as input for execution by a UTM is pretty much all you need for the general idea of a computer program.

The fact that a UTM can emulate any TM at all makes it easy to establish fundamental equivalences between various computational methods. If a particular method can produce a UTM, then it's obvious it can compute anything computable by an arbitrary TM. Such a formalism or language is said to be Turing complete. Specifications for UTMs have been written in formalisms as diverse as XSLT, sendmail.cf and cellular automata such as Conway's game of life.
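To make the "finite number of simple and unambiguous rules" concrete, here is a minimal sketch of a Turing machine interpreter in Python; the rule table shown is a made-up example, not Turing's own encoding. A UTM is then just such an interpreter whose rule table is itself read off the tape rather than hard-coded.

    from collections import defaultdict

    def run_tm(rules, tape, state="start", halt="halt", max_steps=1000):
        """Run a Turing machine.

        rules maps (state, symbol) -> (write_symbol, move, next_state),
        where move is -1 (left) or +1 (right).  The tape is stored as a dict
        so it is unbounded in both directions, defaulting to blank cells (0).
        """
        tape = defaultdict(int, enumerate(tape))
        head = 0
        for _ in range(max_steps):
            if state == halt:
                break
            write, move, state = rules[(state, tape[head])]
            tape[head] = write
            head += move
        return [tape[i] for i in range(min(tape), max(tape) + 1)]

    # Example machine: walk right over a block of 1s and append one more 1.
    rules = {
        ("start", 1): (1, +1, "start"),
        ("start", 0): (1, +1, "halt"),
    }
    print(run_tm(rules, [1, 1, 1]))   # -> [1, 1, 1, 1]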
This property of universality shifts the competition from what can be computed to the number of steps and amount of input required. No matter how featureful, elegant and concise the programming language you construct, whatever computations it can perform can be done in sendmail.cf or brainfuck.

Universality has been of interest to some heterodox physicists, such as Ed Fredkin and Stephen Wolfram. Fredkin, on a suggestion of Feynman's, has been investigating the possibility of using cellular automata as a physics model and suggests suitable automata must be both universal (i.e. Turing complete) and reversible. Wolfram (also big on CA) sees in the UTM an upper bound to the complexity of the makeup of the universe. David Deutsch has proposed that "every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means", and has attempted to extend the idea of a UTM to quantum computing.

Mathematician Gregory Chaitin has used the UTM as a building block in his algorithmic information theory, refining the notion by specifying that the encodings of the TMs must tell the UTM how long they are (Chaitin says they are 'self-delimiting'). He uses them to define the algorithmic complexity of a string relative to a given UTM - the length of the shortest input that will cause the UTM to output that string - and to formulate his bizarre constant Omega: the probability, for some self-delimiting UTM, that it will halt on random input. Chaitin imagines flipping a coin to determine the state of each successive bit of the unread tape, as the UTM reads in its program. It's required to be self-delimiting so that the UTM knows when to stop reading and Chaitin knows when to stop flipping coins.
<urn:uuid:be5b24b2-1013-4b66-8ff9-ea0e5a257bc6>
CC-MAIN-2014-15
http://everything2.com/title/Universal+Turing+Machine
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223203235.2/warc/CC-MAIN-20140423032003-00354-ip-10-147-4-33.ec2.internal.warc.gz
en
0.937898
1,180
3.875
4
Rise of the Boson-Sampling Computer

OXFORD, England, and ST. LUCIA, Australia, Jan. 2, 2013 — Despite the widespread research on quantum computing, nobody has built a machine that uses quantum mechanics to solve a computational problem faster than a classical silicon-based computer. Now scientists from universities in England and Australia have developed a device called a boson sampling computer that rivals a quantum computer. Although boson sampling computers are not believed to have all the problem-solving ability of a full quantum computer, they can solve some problems faster than today's machines, and they are much easier to build experimentally with existing photonic technology. The device could pave the way to larger devices that could offer the first definitive quantum-enhanced computation.

Boson sampling requires three main ingredients: single bosons, the large-scale linear manipulation of bosons, and single-boson-sensitive detectors.

(Image caption: The 8-cm-long silica-on-silicon photonic chip in the center of the picture served as the four-photon quantum boson sampling machine. Arrays of single-mode fibers are glued to the left and right sides of the chip. For viewing purposes, a red laser is coupled into two of the single-mode fibers (right side of picture), which illuminate a portion of the on-chip interferometric network. For the boson sampling experiment, the red laser was replaced with single-photon sources. There are five thermal phase-shifting elements on top of the chip, although they were not used in this experiment. Courtesy of Dr. James C. Gates.)

Photons are identical at a fundamental level and exhibit a strong form of quantum entanglement. If two sufficiently identical photons come together, they behave in a connected way — almost as if they clump together. When scaled up to multiple input photons, these entanglements cause the outputs of a boson-sampling circuit to clump together in a characteristic way, predictable by quantum mechanics but difficult to calculate using conventional computers.

In their experiment, Oxford University's Justin Spring and colleagues used single photons and quantum interference to perform a calculation that is believed to be very difficult on a classical computer. "Boson sampling provides a model of quantum-enhanced computation that is experimentally feasible with existing photonic technology," Spring said. "Future generations of boson sampling machines will benefit from ongoing advances in integrated photonics." The experiment was performed on a photonic chip developed by professor Peter Smith and Dr. James Gates from the Optoelectronics Research Center at the University of Southampton.

(Image caption: The logo of the Quantum Technology Lab spelled out with the laser beams used in the BosonSampling device. Courtesy of Alisha Toft.)

"The chip offers a scalable route … to build large linear systems required for larger boson sampling machines," Gates said. "If one is going to eventually need to move 'on chip' with more complex boson sampling machines, there is obvious benefit in building the proof-of-principle devices 'on chip' as well. The move to optical processing on a chip format can be likened to the shift to integrated silicon chips in electronics."

In a separate experiment, Dr.
Matthew Broome and colleagues at the University of Queensland built a device they called BosonSampling to determine whether quantum computers are the only way to perform efficient computations, or whether conventional computers can solve the problem almost as quickly. The device implemented a form of quantum computation in which a handful of single photons were sent through a photonic network and the researchers then sampled how often they exited the network outputs. "Although this sounds simple, for large devices and many photons, it becomes extremely difficult to predict the outcomes using a conventional computer, whereas our measurements remain straightforward to do," Broome said.

The device — proposed in late 2010 by associate professor Scott Aaronson and Dr. Alex Arkhipov of MIT — will provide strong evidence that quantum computers do indeed have an exponential advantage over conventional computers.

(Image caption: Dr. Matthew Broome at work on the BosonSampling device. Courtesy of Alisha Toft.)

"Scott and Alex's proposal was a 94-page mathematical tour de force," said experimental team leader Andrew White of the University of Queensland. "We genuinely didn't know if it would implement nicely in the lab, where we have to worry about real-world effects like lossy circuits, and imperfect single-photon sources and detectors." The BosonSampling device behaves as expected, paving the way for larger and larger instances of this experiment. The prediction is that, with just tens of photons, it can outperform any of today's supercomputers.

"The first proof-of-principle demonstrations of BosonSampling have been shown — even if only with three photons, rather than the 30 or so required to outperform a classical computer," Aaronson said. "I did not expect this to happen so quickly."

The studies appeared in Science (doi: 10.1126/science.1231440). For more information, visit: www.ox.ac.uk
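Why the clumping is hard to predict classically is usually explained through matrix permanents: in the standard boson-sampling analysis of Aaronson and Arkhipov (background the article does not spell out), each output probability is proportional to the squared magnitude of the permanent of a submatrix of the network's transfer matrix, and no efficient classical algorithm for the permanent is known. A brute-force sketch in Python, with an arbitrary made-up matrix standing in for the real network:

    import itertools
    import numpy as np

    def permanent(M):
        """Brute-force permanent: sums over all n! permutations, so the cost
        explodes with matrix size, which is why sampling from a large photonic
        network is believed to be classically hard."""
        n = M.shape[0]
        return sum(np.prod([M[i, p[i]] for i in range(n)])
                   for p in itertools.permutations(range(n)))

    # Toy 3-photon example with a random complex matrix (illustration only).
    rng = np.random.default_rng(0)
    A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    print(abs(permanent(A)) ** 2)   # proportional to one output probability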
<urn:uuid:1ca368cc-c61b-4327-adc6-1487f4aa34ec>
CC-MAIN-2014-15
http://photonics.com/Article.aspx?AID=52670
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223202457.0/warc/CC-MAIN-20140423032002-00348-ip-10-147-4-33.ec2.internal.warc.gz
en
0.921037
1,134
3.796875
4
Scientists Score New Victory Over Quantum Uncertainty

ScienceDaily (Feb. 26, 2012) — Most people attempt to reduce the little uncertainties of life by carrying umbrellas on cloudy days, purchasing automobile insurance or hiring inspectors to evaluate homes they might consider purchasing. For scientists, reducing uncertainty is a no less important goal, though in the weird realm of quantum physics, the term has a more specific meaning.

For scientists working in quantum physics, the Heisenberg Uncertainty Principle says that measurements of properties such as the momentum of an object and its exact position cannot be simultaneously specified with arbitrary accuracy. As a result, there must be some uncertainty in either the exact position of the object, or its exact momentum. The amount of uncertainty can be determined, and is often represented graphically by a circle showing the area within which the measurement actually lies.

Over the past few decades, scientists have learned to cheat a bit on the Uncertainty Principle through a process called "squeezing," which has the effect of changing how the uncertainty is shown graphically. Changing the circle to an ellipse and ultimately to almost a line allows one component of the complementary measurements -- the momentum or the position, in the case of an object -- to be specified more precisely than would otherwise be possible. The actual area of uncertainty remains unchanged, but is represented by a different shape that serves to improve accuracy in measuring one property.

This squeezing has been done in measuring properties of photons and atoms, and can be important to certain high-precision measurements needed by atomic clocks and the magnetometers used to create magnetic resonance imaging views of structures deep inside the body. For the military, squeezing more accuracy could improve the detection of enemy submarines attempting to hide underwater or improve the accuracy of atom-based inertial guidance instruments.

Now physicists at the Georgia Institute of Technology have added another measurement to the list of those that can be squeezed. In a paper appearing online February 26 in the journal Nature Physics, they report squeezing a property called the nematic tensor, which is used to describe the rubidium atoms in Bose-Einstein condensates, a unique form of matter in which all atoms have the same quantum state. The research was sponsored by the National Science Foundation (NSF).

"What is new about our work is that we have probably achieved the highest level of atom squeezing reported so far, and the more squeezing you get, the better," said Michael Chapman, a professor in Georgia Tech's School of Physics. "We are also squeezing something other than what people have squeezed before."

Scientists have been squeezing the spin states of atoms for 15 years, but only for atoms that have just two relevant quantum states -- known as spin ½ systems. In collections of those atoms, the spin states of the individual atoms can be added together to get a collective angular momentum that describes the entire system of atoms. In the Bose-Einstein condensate atoms being studied by Chapman's group, the atoms have three quantum states, and their collective spin totals zero -- not very helpful for describing systems. So Chapman and graduate students Chris Hamley, Corey Gerving, Thai Hoang and Eva Bookjans learned to squeeze a more complex measure that describes their system of spin 1 atoms: the nematic tensor, also known as the quadrupole.
Nematicity is a measure of alignment that is important in describing liquid crystals, exotic magnetic materials and some high-temperature superconductors. "We don't have a spin vector pointing in a particular direction, but there is still some residual information in where this collection of atoms is pointing," Chapman explained. "That next higher-order description is the quadrupole, or nematic tensor. Squeezing this actually works quite well, and we get a large degree of improvement, so we think it is relatively promising."

Experimentally, the squeezing is created by entangling some of the atoms, which takes away their independence. Chapman's group accomplishes this by colliding atoms in their ensemble of some 40,000 rubidium atoms. "After they collide, the state of one atom is connected to that of the other atom, so they have been entangled in that way," he said. "This entanglement creates the squeezing."

Reducing uncertainty in measuring atoms could have important implications for precise magnetic measurements. The next step will be to determine experimentally if the technique can improve the measurement of magnetic fields, which could have important applications. "In principle, this should be a straightforward experiment, but it turns out that the biggest challenge is that magnetic fields in the laboratory fluctuate due to environmental factors such as the effects of devices such as computer monitors," Chapman said. "If we had a noiseless laboratory, we could measure the magnetic field both with and without squeezed states to demonstrate the enhanced precision. But in our current lab environment, our measurements would be affected by outside noise, not the limitations of the atomic sensors we are using."

The new squeezed property could also have application to quantum information systems, which can store information in the spin of atoms and their nematic tensor. "There are a lot of things you can do with quantum entanglement, and improving the accuracy of measurements is one of them," Chapman added. "We still have to obey Heisenberg's Uncertainty Principle, but we do have the ability to manipulate it."

Reference: Hamley, C. D., C. S. Gerving, et al. (2012). "Spin-nematic squeezed vacuum in a quantum gas." Nature Physics, advance online publication. From the abstract: The standard quantum limit of measurement uncertainty can be surpassed using squeezed states, which minimize the uncertainty product in Heisenberg's relation by reducing the uncertainty of one property at the expense of another. Collisions in ultracold atomic gases have been used to induce quadrature spin squeezing in two-component Bose condensates, for which the complementary properties are the components of the total spin vector. Here, we generalize this finding to a higher-dimensional spin space by measuring squeezing in a spin-1 Bose condensate. Following a quench through a quantum phase transition, we demonstrate that spin-nematic quadrature squeezing improves on the standard quantum limit by up to 8–10 dB, a significant increase on previous measurements. This squeezing is associated with negligible occupation of the squeezed modes, and is analogous to optical two-mode vacuum squeezing. The observation has implications for continuous-variable quantum information and quantum-enhanced magnetometry.
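As a rough numerical picture of the "circle squeezed into an ellipse" idea, the sketch below draws Gaussian samples for two ordinary quadratures (not the spin-nematic variables measured in this work) with variances rescaled by a squeezing parameter, and checks that one variance drops below the unsqueezed value while the uncertainty product is preserved. Units and numbers are illustrative only.

    import numpy as np

    r = 1.15                       # squeezing parameter; exp(2r) ~ 10, i.e. about 10 dB
    rng = np.random.default_rng(1)
    n = 200_000

    # Unsqueezed (vacuum-like) variances are 0.5 in these units.
    x = rng.normal(scale=np.sqrt(0.5) * np.exp(-r), size=n)   # squeezed quadrature
    p = rng.normal(scale=np.sqrt(0.5) * np.exp(+r), size=n)   # anti-squeezed quadrature

    var_x, var_p = x.var(), p.var()
    print(f"Var(x) = {var_x:.4f}  (unsqueezed: 0.5)")
    print(f"Var(p) = {var_p:.4f}")
    print(f"product = {var_x * var_p:.4f}  (lower bound: 0.25)")
    print(f"squeezing = {10 * np.log10(0.5 / var_x):.1f} dB below the standard limit")

The product of the two variances stays at the Heisenberg bound; only its distribution between the two properties changes, which is the sense in which squeezing "manipulates" rather than violates the principle.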
<urn:uuid:cedc72a3-8b07-49a1-94b8-e748fac2647f>
CC-MAIN-2014-15
http://www.atheistfoundation.org.au/forums/showthread.php?t=13394
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609524259.30/warc/CC-MAIN-20140416005204-00554-ip-10-147-4-33.ec2.internal.warc.gz
en
0.941785
1,342
3.796875
4
A research team from the Institut Català de Nanotecnologia (ICN), in Barcelona, has demonstrated a device that induces electron spin motion without net electric currents, a key step in developing the spin computers of the future. The results are published in the Dec. 17 issue of the journal Science. The authors are Marius V. Costache and Sergio O. Valenzuela, an ICREA Professor who leads the Physics and Engineering of Nanodevices Group at ICN.

Spintronics is a branch of electronics that aims to use the electron spin rather than its charge to transport and store information. The electron spin comes in two forms, "spin up" or "spin down", and would allow significantly more data to be stored and analyzed than is possible with current electronics. Moreover, spin computers would be able to process vast amounts of information while using less energy and generating much less heat than conventional computers.

Advances in spintronics have already impacted commercial products, enabling a huge increase in the storage capacity of magnetic hard disks. However, those devices comprise ferromagnetic multilayers that act as spin filters and require conventional electrical charge currents in order to work. To garner the full potential of spintronics, further fundamental advances are urgently needed. Researchers working in this field face a key challenge: how to generate and control spins without the simultaneous generation of electric current, and the resultant energy losses? This would enable not just data storage, but calculations to be realized directly using spin states.

As reported in the journal Science, Prof. Valenzuela and Dr. Costache have proposed and experimentally demonstrated a ratchet concept to control the spin motion. In analogy to a ratchet wrench, which produces uniform rotation from oscillatory motion, such ratchets achieve directed spin transport in one direction in the presence of an oscillating signal. Most important, this signal could be an oscillatory current that results from environmental charge noise; thus future devices based on this concept could function by gathering energy from the environment.

The efficiency of the ratchet can be very high. Reported results show electron polarizations of the order of 50%, but they could easily exceed 90% with improvements to the device design. The spin ratchet, which relies on a single-electron transistor with a superconducting island and normal-metal leads, is able to discriminate the electron spin, one electron at a time. The devices can also function in a "diode" regime that resolves spin with nearly 100% efficacy and, given that they work at the single-electron level, they could be used to address fundamental questions of quantum mechanics in the solid state or to help prepare the path for ultrapowerful quantum or spin computers.

The main drawback of the devices is that they work at low temperature. However, this does not represent a problem for quantum computing applications, as solid-state implementations of quantum computers will most likely require similar working conditions. Future research at the ICN will focus on increasing the spin ratchet efficiency and testing different ratchet protocols to implement a working device at room temperature.
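The ratchet-wrench analogy can be made concrete with a toy classical model; nothing below is specific to the superconducting single-electron device reported in Science, and all parameters are invented. An overdamped Brownian particle in a spatially asymmetric potential that is periodically switched on and off acquires a net drift even though the switching itself is unbiased:

    import numpy as np

    rng = np.random.default_rng(2)

    def force(x):
        """Minus the gradient of an asymmetric periodic ('ratchet') potential
        V(x) = sin(2*pi*x) + 0.25*sin(4*pi*x), period 1."""
        return -(2 * np.pi * np.cos(2 * np.pi * x) + np.pi * np.cos(4 * np.pi * x))

    def flashing_ratchet(n_particles=1000, cycles=100, dt=1e-3, D=0.1,
                         t_on=0.5, t_off=0.3):
        """Overdamped Brownian particles in a potential switched on and off.
        The unbiased on/off drive plus the asymmetric potential yields a
        nonzero average displacement: the classical ratchet effect."""
        x = rng.uniform(0, 1, n_particles)
        x0 = x.copy()
        for _ in range(cycles):
            for phase_time, potential_on in ((t_on, True), (t_off, False)):
                for _ in range(int(phase_time / dt)):
                    drift = force(x) * dt if potential_on else 0.0
                    x += drift + np.sqrt(2 * D * dt) * rng.normal(size=n_particles)
        return (x - x0).mean()

    print("mean displacement per particle:", flashing_ratchet())

The spin ratchet in the experiment exploits the same logic at the single-electron level, with the "asymmetry" built into the device and the drive supplied by an oscillating (or noisy) signal rather than a flashing potential.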
CATALAN INSTITUTE OF NANOTECHNOLOGY (ICN)

The Catalan Institute of Nanotechnology (ICN) is a private foundation created in 2003 and forms part of CERCA, the network of research centers launched by the Catalan Government as a key plank of its long-term strategy to foster the development of a knowledge-based economy. The ICN's multicultural team of scientists, representing over 20 nationalities, aims to produce cutting-edge science and develop next-generation technologies by investigating the new properties of matter that arise from the fascinating behavior at the nanoscale.

Research is devoted on one side to the study and understanding of fundamental physical phenomena associated with state variables (electrons, spin, phonons, photons, plasmons, etc.), the investigation of new properties derived from tailored nanostructures, and the opening of new routes and fabrication processes for the conception of new nanodevices. On the other side, researchers also explore the state of aggregation at the nanometric scale, the development of nanoproduction methods, the synthesis, analysis and manipulation of aggregates and structures of nanometric dimension, and the development of techniques for characterizing and manipulating nanostructures. These lead to commercially relevant studies such as the functionalization of nanoparticles, the encapsulation of active agents, novel drugs and vaccines, and new nanodevices and nanosensors, with applications in health, food, energy, the environment, etc.

The Institute actively promotes collaboration among scientists from diverse areas of specialization (physics, chemistry, biology, engineering), and trains new generations of scientists, offering studentships and doctoral and post-doctoral positions.
<urn:uuid:c9d71055-57e4-4a41-9ae8-8e52bf81659d>
CC-MAIN-2014-15
http://www.eurekalert.org/pub_releases/2010-12/icdn-ar121410.php
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609533121.28/warc/CC-MAIN-20140416005213-00261-ip-10-147-4-33.ec2.internal.warc.gz
en
0.905118
1,110
3.734375
4
Speed of light
From CreationWiki, the encyclopedia of creation science

The speed of light in vacuum is held to be constant at 299,792,458 m/s (186,282.397 miles per second). Designated by the symbol "c" (for "constant"), it is a fundamental quantity of the universe. According to special relativity it is the universe's speed limit, and it appears in the relation between mass and energy, E = mc². Some have proposed that the speed of light has decayed since the Creation. While this theory opened the door to scientific solutions to the distant starlight problem, it is not generally accepted by creation scientists.

One-Way Speed of Light

Sagnac proved that light travels at different speeds depending on its direction and its proximity to the center of Earth's gravity, lending weight to the Anisotropic convention. The one-way speed of light has never been measured. Every known measurement of the speed of light includes reflecting it from another surface. This necessarily changes the nature of the measurement, as the result can only be the average of the outbound and inbound legs. Additionally, all electronic means to measure the speed of light cannot themselves operate at the speed of light. This introduces error and constraint into the measurement. If we attempt to embed a signal into a light beam to synchronize two clocks at a distance, the time it takes both to create and to interpret the signal introduces another constraint. In fact, any introduction of a measurement mechanism necessarily constrains the measurement, because no measurement mechanism can operate at the speed of light.

Einstein understood the primary paradox of the speed of light, as evidenced by the theory of black holes. A black hole's gravity is so strong that light cannot reach escape velocity. However, gravity can only act in this manner between bodies with mass, which necessarily means that photons have mass. Physicists generally do not accept the notion that photons have mass. If they do not, they would be able to escape a black hole, and it would not be black after all. However, if the photon has mass, then it is a particle with mass traveling at the speed of light. For such particles, time stands still. There is no duration between their departure (from an emitting source) and their arrival at their destination. Essentially departure and arrival are instantaneous. If this is the case with a photon, then there is no such thing as a light-year in space, and the age of the Cosmos cannot be determined using light as a basis.

Moreover, the speed of light is a function of distance and duration: speed = distance/time. However, Einstein asserted that time is relative. If this is true, then the speed of light is also relative and cannot be constant. To resolve this paradox, Einstein side-stepped it by stipulating that the speed of light is constant without ever proving it: "That light requires the same time to traverse the path A > M as for the path B > M is in reality neither a supposition nor a hypothesis about the physical nature of light, but a stipulation which I can make of my own freewill in order to arrive at a definition of simultaneity" (Einstein 1961, p. 23) [emphasis is in the original].

Whenever scientists encounter particle behaviors that defy the speed of light, such as the propensity of particles to instantly share behaviors even across vast distances (e.g.
Quantum Entanglement), they still hold to the notion that the speed of light is constant, eliciting the strangest explanations, including the idea that all particles in the universe are connected to all other particles through wormholes. Such oddball theories are the simplest evidence that the "constant" speed of light has been accepted as a reality rather than a stipulation for mathematical purposes.

Albert A. Michelson is credited with developing the method for the definitive measurement of the speed of light. In 1902 he published his classic paper on the speed of light, and in 1907 he was awarded the Nobel Prize in Physics for this work. Michelson also proposed the standardization of the international unit of length, the meter, using specified wavelengths of light rather than an artifact. For decades the scientific community used Michelson's standardization method, but it finally decided to define the SI unit of length according to the speed of light. Today one meter is defined as exactly 1/299,792,458 of the distance that a beam of light travels in one second.

Many scientists in the past have speculated about possible changes in the values of one or more physical constants and the implications. These speculations were not always greeted with enthusiasm by the scientific community, because the implications of any variation in any constant are enormous: it would introduce changes at astronomical levels in the very fiber of the Universe. Yet the idea never totally died out and was never totally suppressed. Glenn Morton was one of the first persons to put forth a concrete and testable model. He started not from changing fundamental constants, but from another angle. Soon Barry Setterfield came forward with his proposal of variation in the velocity of light. His initial proposal went through several revisions and modifications, and creationist publications quoted him widely. Some secular publications also used the information, but the general response was to resist his proposals. Johnson C. Philip from India put forth the same idea in a broader way in 1982 and did some work with the Physics department of Jiwaji University in India. However, he had to abandon the work in 1984 due to the resistance of some non-creationist professors. The proposal remains promising, and much work can be done. The resistance remains, especially from non-creationists. However, the topic might find a revival, now that the secular community has started to consider the idea of changing fundamental constants.

The speed of light has been used to calculate the distance of supernova 1987A from Earth with great accuracy, based on observing the time taken for its light to illuminate the Large Magellanic Cloud. It is the standard method for calculating the distance to nearby galaxies. The part of the SN1987A ring perpendicular to the explosion center (as seen from us) was observed to light up about 8 months after the explosion. The light that took a detour via the ring to us was always a ring radius behind the direct light, regardless of the speed of light that prevailed during the trip. The ring radius could therefore be calculated as these 8 months times the speed of light as it applied in 1987, when the measurement was made. Thus it is not possible from this observation to deduce whether light had a different speed before 1987.

The notion of c-decay is currently out of favor even among creationists. Two models for the creation of the universe, i.e. white hole cosmology and cosmological relativity, both assume a constant value of c.
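The ring-radius argument above is simple arithmetic that can be checked directly: if the perpendicular part of the ring lit up about 8 months after the explosion, its radius is about 8 light-months at the 1987 value of c, whatever speed is assumed for earlier epochs. A quick computation (illustrative numbers only):

    c = 299_792_458                     # m/s, the defined value used for the 1987 measurement
    month = 365.25 * 24 * 3600 / 12     # average month in seconds

    radius_m = c * 8 * month
    print(f"ring radius ~ {radius_m:.3e} m")
    print(f"            ~ {radius_m / 9.4607e15:.2f} light-years")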
The Anisotropic Synchrony Convention allows a variable value for c, and likewise provides for c to be relative to the speed of the emitting object. Anisotropism is the actual de facto convention for Scripture, as God describes things from a human's-eye point of view. Even Christ said he would use earthly things to describe heavenly things. The human point of view is integral to the Anisotropic convention, providing for the instantaneous arrival of distant starlight as well as explaining local measurements in terms of time dilation.

- Biography of Albert A. Michelson from the Nobel Committee
- An Alternate View of SN1987A by Selva Harris.
- Speed of light may have changed recently by Eugenie Samuel Reich, NewScientist.com
<urn:uuid:f5558b05-9aa3-4b0f-b3c4-87ff10da7327>
CC-MAIN-2014-15
http://creationwiki.org/Speed_of_light
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223206770.7/warc/CC-MAIN-20140423032006-00055-ip-10-147-4-33.ec2.internal.warc.gz
en
0.955387
1,547
4.21875
4
Portland, Ore. - In the search for a physical system that could encode quantum states and thus form the basis for a practical quantum computer, researchers at the University of Michigan and the University of Rochester are turning to photonics. Photons, like electrons, are quantum particles and can be manipulated with optical devices. By making use of semiconductor structures such as acousto-optic modulators or quantum wells, photons can modify the quantum states of electrons.

In a recent experiment at the University of Michigan, researchers used a magnetic semiconductor material that confined electrons in a quantum well. Illuminating the well with ultrafast laser pulses then entangled the electrons' spin states. Entanglement is the fundamental basis for quantum computing. "After studying the results of others who have tried all kinds of different approaches to controlling qubits [quantum bits], we found a method based on semiconductor technology that, when combined with advances in nanotechnology, we think holds great promise for practical implementations," said professor Roberto Merlin, a physicist on the project at the university's Optical Physics Interdisciplinary Laboratory.

Another project, at the University of Rochester's Center for Quantum Information, is using methods based on nonlinear optical waveguides to investigate both quantum entanglement between photons and more conventional physics based on photon interference. The work, led by Ian Walmsley, a physicist specializing in ultrafast phenomena, has seen some success on both fronts. Though not a pure quantum-state operation, photon interference has turned out to be useful in decoding quantum states and might serve as a practical I/O method for a quantum processor, the Rochester team reports. In addition, the optical interference techniques developed at the lab could be applied to quantum communications over optical fibers, an area that has recently spawned an actual prototype of a secure communications system based on quantum principles.

The Rochester team has developed a new type of high-brightness optical source that achieves tight control of a photon's wavefunction inside an optical waveguide. The physical technique is to use phase matching to control two-photon interactions. Confining the photons in the waveguide cavity has allowed the researchers to first entangle and then disentangle photon states. While these experiments have been successful in generating two pairs of entangled photons, the problem facing the researchers is how to generate a large number of pairs in order to achieve some practical information-encoding ability. The probability of generating stable pairs decreases exponentially with the number of pairs.

Researchers worldwide are searching for semiconductors that can house quantum states because of the computational boost that quantum information processing could achieve. Today, experimental single-electron transistors can represent only a digital "1" or "0," depending upon whether the charge is present or absent. However, quantum states encode bits in what is known as a "superposition of states," which means that a single electron or photon can represent both logical values simultaneously. A quantum parameter such as an electron's spin state can be used as the representation of a qubit. As long as the spin of an electron is undisturbed, the qubit represents both a 1 and a 0 simultaneously.
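The "superposition of states" just described can be written down concretely. The sketch below is generic single-qubit linear algebra, not a model of the Michigan or Rochester experiments: a state carries amplitudes for both logical values at once, and a relative phase only becomes visible once the amplitudes are made to interfere.

    import numpy as np

    ket0 = np.array([1.0, 0.0])            # logical 0
    ket1 = np.array([0.0, 1.0])            # logical 1

    # An equal superposition: the qubit carries both amplitudes at once.
    psi = (ket0 + ket1) / np.sqrt(2)
    print("P(0), P(1) =", np.abs(psi) ** 2)            # -> [0.5, 0.5]

    # A relative phase leaves the populations unchanged ...
    phi = np.pi / 3
    psi_phase = (ket0 + np.exp(1j * phi) * ket1) / np.sqrt(2)
    print("with phase, P(0), P(1) =", np.abs(psi_phase) ** 2)

    # ... until the amplitudes are recombined (a Hadamard-like interference step).
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    print("after interference:", np.abs(H @ psi_phase) ** 2)   # phase now shows up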
When the spin of one electron interacts with another, the result can perform parallel computations on all the values encoded into their wavefunction. Unfortunately, the very thing that makes quantum systems useful - their ability to superpose values - makes them even more prone to errors than classical systems. The nebulous state of qubits can be destroyed by a wide variety of factors, all of which boil down to an inadvertent coupling to the environment, resulting in decoherence of the superposed values.

To solve this problem, quantum error-correction methods were proposed as early as 1995 and first demonstrated in 1998. Since then, many groups have refined quantum error-correction encoding techniques, which basically replicate a nebulous qubit's value onto separate physical systems that are "entangled" - that is, their nebulous values are synchronized over time despite different physical locations. Entanglement enables observers to subsequently "compare" the resultant qubits after a calculation, without "observing" their nebulous values, to see if any differences arose between the copies. Such differences indicate an error, which usually resets the system to try that calculation over again. Entanglement also aids in cryptography by making it possible to detect eavesdropping.

In the University of Michigan work, Merlin's group achieved entanglement of three noninteracting electrons, by virtue of a 5-watt, 532-nanometer laser producing 130-femtosecond pulses at 82 MHz, focused down to a spot with a diameter of 400 microns. Each laser pulse supplied the energy to create what physicists call an exciton - a bound electron-hole pair - with a diameter of about 5 nm in a cadmium-tellurium quantum well. Electrons within that radius from donor manganese impurities in the quantum well became entangled. In the experiment, three such noninteracting electrons were entangled. "The source of our qubits is electrons bound to donors - here, manganese impurities in a cadmium-tellurium quantum well," said Merlin. "In principle we could entangle thousands of electrons, making our method very scalable."

The formation of excitons from an electron-hole pair is a coulomb interaction, here resulting from the optical energy added by the laser to confined paramagnetic manganese impurities in the presence of a magnetic field. The distance between the electron and hole within the exciton is called the Bohr radius - in this case, it's 5 nm. Excitons typically move freely within a bulk semiconductor, but when the exciton is trapped in a well, thin wire or quantum dot with dimensions of the same order as the exciton, a confinement effect occurs. A quantum well confines the exciton in only one dimension, leaving it free in the other two, while a quantum wire confines it in two dimensions, leaving it only one dimension in which to move. A quantum dot confines the exciton in all three dimensions.

"We have shown that electrons can be optically excited to generate many-spin Raman coherences in nonoverlapping excitons," Merlin said. "Our procedure is potentially set-specific and scalable for quantum computing applications." In the experiment, the manganese electrons within the radius of the exciton became entangled after three laser bursts. With repeated laser bursts, Merlin proposes to entangle an arbitrary number of electrons using his semiconductor-based method.
The entanglement was attributed to resonant transitions between Zeeman-split spin states, which can be sensed by detecting a harmonic of the fundamental Zeeman frequency that corresponds to the number of entangled electrons. In the experiment, three electrons were entangled, shown by detecting the third harmonic of the Zeeman frequency. "Our method, relying on the exchange interaction between localized excitons and paramagnetic impurities, can in principle be applied to entangle an arbitrarily large number of spins," said Merlin.

Next, Merlin intends to use a masking method to make it possible to aim the laser beam at specific regions of the semiconductor, so that the semiconductor device can be addressed randomly. "Reading and writing we have demonstrated here, but only for an ensemble of electrons. Right now it's 'almost' like having a quantum computer, except that we are turning on and off all the bits at the same time. Next we want to use masking to selectively address individual qubits," said Merlin.

Also on Merlin's drawing board is a more refined laser pulse that, in addition to forming arbitrary excitons, also assists in performing specific quantum calculations. "We want to use pulse shaping to put a little bump here or a spike there," he said. "We think that by shaping the pulse we can control the entire wavefunction of the electron, which you will need to do to perform quantum computations."

Merlin's research was funded by the ACS Petroleum Research Fund, the National Science Foundation and the Air Force Office of Scientific Research. The lab is part of Michigan's Frontiers in Optical Coherent and Ultrafast Science Center.

- Chappell Brown contributed to this report
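The harmonic signature described above can be illustrated with a toy signal model; it is purely illustrative, with invented numbers, and is not the Raman-coherence observable actually measured. A coherence of N spins precessing at the Zeeman frequency shows up at N times that frequency, which a Fourier transform picks out:

    import numpy as np

    f_zeeman = 5.0                     # arbitrary units
    n_entangled = 3                    # three entangled electrons -> third harmonic
    t = np.linspace(0, 20, 4000, endpoint=False)

    # Toy signal: a fundamental from single-spin precession plus a weaker
    # component at N * f_zeeman from the N-spin coherence, plus noise.
    rng = np.random.default_rng(3)
    signal = (np.cos(2 * np.pi * f_zeeman * t)
              + 0.3 * np.cos(2 * np.pi * n_entangled * f_zeeman * t)
              + 0.1 * rng.normal(size=t.size))

    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])

    for target in (f_zeeman, n_entangled * f_zeeman):
        k = np.argmin(np.abs(freqs - target))
        print(f"peak near frequency {target:.1f}: {spectrum[k]:.1f}")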
<urn:uuid:955b272d-b2a3-4c90-acde-239cc001507a>
CC-MAIN-2014-15
http://www.eetimes.com/document.asp?doc_id=1145653
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1397609535095.7/warc/CC-MAIN-20140416005215-00606-ip-10-147-4-33.ec2.internal.warc.gz
en
0.925023
1,685
3.59375
4
*** For immediate use, April 19, 2012

Long predicted but never observed, coherent quantum phase slip can be harnessed to develop a novel class of quantum devices.

A new type of quantum bit called a "phase-slip qubit", devised by researchers at the RIKEN Advanced Science Institute and their collaborators, has enabled the world's first-ever experimental demonstration of coherent quantum phase slip (CQPS). The groundbreaking result sheds light on an elusive phenomenon whose existence, a natural outcome of the hundred-year-old theory of superconductivity, has long been speculated about but never actually observed.

Superconductivity describes a phenomenon in which electrons pass through certain types of materials without any resistance when cooled below a given temperature. Among the most important applications of superconductivity is the Josephson junction, named after physicist Brian Josephson, who in 1962 predicted that a superconducting current could tunnel between superconductors separated by a thin insulating layer. This phenomenon, the Josephson effect, has been applied in a variety of areas including magnetometer design, voltage standardization, and quantum computing.

Researchers have long known of an intriguing theoretical parallel to the Josephson effect in which insulator and superconductor are reversed: rather than electric charges jumping from one superconducting layer to another across an insulating layer, magnetic flux quanta jump from one insulator to another across a superconducting layer (Figure 1). Quantum tunneling of electrons in the Josephson junction is replaced in this parallel by the coherent "slip" of the phase, a quantum variable that, in superconducting circuits, plays a dual role to that of electric charge.

Coherent quantum phase slip (CQPS), as this phenomenon is known, has long been limited to theory, but no more. In a paper in Nature, Oleg Astafiev and colleagues at the RIKEN Advanced Science Institute (ASI) and NEC Smart Energy Research Laboratories report the first direct observation of CQPS in a narrow superconducting wire of indium oxide (InOx). The wire is inserted into a larger superconducting loop to form a new device called a phase-slip qubit, with the superconducting layer (the thin wire) sandwiched between insulating layers of empty space (Figure 2).

By tuning the magnetic flux penetrating this loop while scanning microwave frequencies, the researchers detected a band gap in the energy curves for the two flux states of the system (Figure 3), just as theory predicts. This gap is a result of quantum mechanics, which prevents the two states from occupying the same energy level, forcing them to tunnel across the superconducting layer, and through a quantum phase slip in the narrow wire, to avoid it. While demonstrating conclusively the existence of CQPS, the successful experiment also ushers in a novel class of devices that exploit the unique functionality of quantum phase slip to forge a new path in superconducting electronics.

Reference: O. V. Astafiev, L. B. Ioffe, S. Kafanov, Yu. A. Pashkin, K. Yu. Arutyunov, D. Shahar, O. Cohen, & J. S. Tsai. "Coherent quantum phase slip."
Nature, 2012, DOI: 10.1038/nature10930

About RIKEN
RIKEN is Japan's flagship research institute devoted to basic and applied research. Over 2500 papers by RIKEN researchers are published every year in reputable scientific and technical journals, covering topics ranging across a broad spectrum of disciplines including physics, chemistry, biology, medical science and engineering. RIKEN's advanced research environment and strong emphasis on interdisciplinary collaboration have earned it an unparalleled reputation for scientific excellence in Japan and around the world.

About the Advanced Science Institute
The RIKEN Advanced Science Institute (ASI) is an interdisciplinary research institute devoted to fostering creative, curiosity-driven basic research and sowing the seeds for innovative new projects. With more than 700 full-time researchers, the ASI acts as RIKEN's research core, supporting inter-institutional and international collaboration and integrating diverse scientific fields including physics, chemistry, engineering, biology and medical science.

About NEC
NEC Corporation is a leader in the integration of IT and network technologies that benefit businesses and people around the world. By providing a combination of products and solutions that cross-utilize the company's experience and global resources, NEC's advanced technologies meet the complex and ever-changing needs of its customers. NEC brings more than 100 years of expertise in technological innovation to empower people, businesses and society. For more information, visit NEC at http://www.nec.com.
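The band gap that signals CQPS is the textbook avoided crossing of a two-level system. The sketch below evaluates the corresponding energy curves for a generic two-level model with made-up numbers (not the fitted device parameters from the paper):

    import numpy as np

    def two_level_energies(flux, flux0=0.5, slope=10.0, gap=1.0):
        """Energies of a flux two-level system versus applied flux (arbitrary units).

        Far from flux0 the two flux states cross linearly; the CQPS coupling
        `gap` turns the crossing into an avoided crossing of size `gap`.
        """
        eps = slope * (flux - flux0)            # energy detuning between flux states
        e = np.sqrt(eps ** 2 + (gap / 2) ** 2)
        return -e, +e

    flux = np.linspace(0.4, 0.6, 9)
    lower, upper = two_level_energies(flux)
    for f, lo, hi in zip(flux, lower, upper):
        print(f"flux={f:.3f}  E-= {lo:6.2f}  E+= {hi:6.2f}  splitting={hi - lo:.2f}")

The minimum splitting, reached when the two flux states are degenerate, equals the coupling strength; detecting that nonzero minimum as a function of flux is what the microwave spectroscopy in the experiment amounts to.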
<urn:uuid:aa188914-8f76-4e72-ba20-06dc72e21f49>
CC-MAIN-2014-15
http://www.jpubb.com/en/press/51440/
s3://commoncrawl/crawl-data/CC-MAIN-2014-15/segments/1398223210034.18/warc/CC-MAIN-20140423032010-00064-ip-10-147-4-33.ec2.internal.warc.gz
en
0.888153
1,057
3.859375
4
In contrast to classical bits of information that are either 0 or 1, quantum bits—or “qubits”—can be in superposition states of 0 and 1. Just like classical bits, however, qubits are physical objects that have to be implemented in real physical systems. Researchers have used single photons as physical qubits, with the quantum information encoded in terms of polarization, angular momentum, and many other degrees of freedom. The time-bin degree of freedom (that is, encoding quantum information in terms of relative arrival times of light pulses) offers a particularly robust kind of single-photon qubit, and two recent papers have advanced the use of time-bin qubits in dramatic ways.

Writing in Physical Review Letters, Peter Humphreys and colleagues at the University of Oxford, UK, have developed a technique for optical quantum computing using time-bin qubits. In principle, their concept allows photonic quantum computing using a single optical path (or fiber) rather than a maze of multiple paths, thereby drastically reducing the overall complexity of these kinds of systems. Also in Physical Review Letters, John Donohue and colleagues at the Institute for Quantum Computing, University of Waterloo, Canada, have demonstrated an ultrafast measurement technique for time-bin qubits that could enable higher data rates and fewer errors in photonic systems. These two developments represent a huge step towards the realization of practical quantum information processing devices using single-photon qubits.

Time-bin qubits were originally developed by a group at the University of Geneva, Switzerland. To understand the basic form of these qubits, consider a single-photon wave packet passing through a two-path Mach-Zehnder interferometer: if the two paths have different lengths, the photon wave packet will exit the interferometer in a quantum-mechanical superposition of an “early time bin” and a “later time bin.” By adjusting the parameters of the interferometer to control relative phase and amplitude, one can accurately produce arbitrary time-bin qubits. The Geneva group famously showed that these time-bin qubits could propagate over long distances in optical fibers with very little decoherence, allowing much more robust quantum communication systems than those based on polarization-encoded qubits [4, 5].

Extending these ideas from the realm of quantum communication, Humphreys et al. have now shown that it is possible to use time-bin qubits for quantum computing. Their approach is based on the well-known linear optics quantum computing (LOQC) paradigm that uses large numbers of ancilla photons and measurement-based nonlinearities to realize near-deterministic quantum logic gates. Previous work on the LOQC approach has primarily been based on polarization qubits and spatial modes that can quickly escalate into extremely unwieldy nested interferometers with very large numbers of paths that need to be stabilized to subwavelength precision [6, 7, 8]. In contrast, Humphreys et al. have now shown that the use of time-bin qubits enables the LOQC approach in a single spatial mode, offering the possibility of far less experimental complexity and a potential for reduced decoherence mechanisms. As shown in Fig. 1, their approach involves a large string of time-bin qubits propagating along a single waveguide (such as an optical fiber), with the available polarization degree of freedom used to define a “register” mode for propagation and storage, and a “processing” mode for qubit manipulations.
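For readers who want the time-bin encoding in symbols, here is a minimal sketch using generic two-mode linear algebra with illustrative numbers; it does not model the ancilla-assisted LOQC gates. An unbalanced interferometer prepares a superposition of early and late bins, and a second unbalanced interferometer converts the relative phase into detection probabilities:

    import numpy as np

    early = np.array([1.0, 0.0])      # photon wave packet in the early time bin
    late = np.array([0.0, 1.0])       # ... and in the late time bin

    def prepare_time_bin(theta, phi):
        """Arbitrary time-bin qubit cos(theta)|early> + e^{i phi} sin(theta)|late>."""
        return np.cos(theta) * early + np.exp(1j * phi) * np.sin(theta) * late

    def analyze(psi, analyzer_phase):
        """Interfere the two bins (as an unbalanced analyzer interferometer would)
        and return the two detection probabilities."""
        u = np.array([[1, np.exp(1j * analyzer_phase)],
                      [np.exp(-1j * analyzer_phase), -1]]) / np.sqrt(2)
        return np.abs(u @ psi) ** 2

    psi = prepare_time_bin(np.pi / 4, np.pi / 2)      # equal bins, 90-degree phase
    for a in (0.0, np.pi / 2, np.pi):
        print(f"analyzer phase {a:.2f}:", analyze(psi, a))

Scanning the analyzer phase sweeps the detection probabilities through interference fringes, which is how the relative phase of a time-bin qubit is read out in practice.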
As the qubits propagate along the waveguide, Humphreys et al. pull various time bins out of the register mode, process them with phase shifts, bit flips, and couplings, and then return them to the register mode in a coherent way. The authors used these ideas to propose the full suite of single-qubit operations and two-qubit entangling gates needed for universal quantum computation. The validity of their basic method was demonstrated in a very convincing experiment that used single-photon qubits and linear optical elements for time-bin creation and manipulation.

In any approach to quantum information processing, one of the key requirements is the ability to measure arbitrary qubit states. For the time-bin qubits discussed here, this turns out to mean that the separation between the “early” and “late” time bins has to be much greater than the resolution time of the photon detection system being used. With commercially available devices, this typically requires nanosecond-scale separation of the time bins and limits the effective “data rate” for sending time-bin qubits down a quantum channel. Using a radical departure from traditional time-bin qubit detection techniques, Donohue et al. have now pushed this number down to the picosecond scale, offering the potential for much higher information density.

The approach of Donohue et al. is essentially a clever method for coherently converting time bins into “frequency bins” that can be easily measured with slow detectors—even when the time bins are pushed arbitrarily close together. As illustrated in the inset to Fig. 1, this time-to-frequency conversion is based on qubit frequency conversion techniques that mix a single-photon qubit with an auxiliary strong laser pulse in a nonlinear medium. By oppositely “chirping” the qubit and strong laser signals (i.e., stretching them so that their frequencies vary oppositely in time—like mirror-image rainbows), the authors were able to show that the time-bin information maps perfectly into corresponding frequency bins. The real power of the technique—the ability to make measurements of arbitrary time-bin qubits—arises when the auxiliary laser pulse is also put into a superposition of time bins. Using this approach, Donohue et al. were able to experimentally demonstrate ultrafast measurements on arbitrary time-bin states.

The next steps for moving these two promising new ideas from the research lab towards “practical quantum information processing devices” will be of a more technical nature. For Humphreys' time-bin LOQC approach, this simply means an emphasis on improving the efficiency of the photonics technologies (switches, phase shifters, etc.) needed, while for Donohue's ultrafast time-bin qubit detectors, it means improving the efficiency of the time-to-frequency conversion process. Combining these ideas with other recent advances in photonic quantum information processing is also an exciting prospect. For example, chip-based devices have recently demonstrated remarkable stability, and a hybrid scheme involving several spatial modes with Humphreys' temporal methods and Donohue's ultrafast detection scheme may enable near-term realizations of quantum circuits with more than “a few” single-photon qubits.

- P. C. Humphreys, B. J. Metcalf, J. B. Spring, M. Moore, X-M. Jin, M. Barbieri, W. S. Kolthammer, and I. A. Walmsley, “Linear Optical Quantum Computing in a Single Spatial Mode,” Phys. Rev. Lett. 111, 150501 (2013).
- J. M. Donohue, M. Agnew, J. Lavoie, and K. J. Resch, “Coherent Ultrafast Measurement of Time-Bin Encoded Photons,” Phys. Rev. Lett. 111, 153602 (2013).
- J. Brendel, N. Gisin, W. Tittel, and H. Zbinden, “Pulsed Energy-Time Entangled Twin-Photon Source for Quantum Communication,” Phys. Rev. Lett. 82, 2594 (1999).
- I. Marcikic, H. de Riedmatten, W. Tittel, H. Zbinden, M. Legré, and N. Gisin, “Distribution of Time-Bin Entangled Qubits over 50 km of Optical Fiber,” Phys. Rev. Lett. 93, 180502 (2004).
- J. D. Franson, “Bell Inequality for Position and Time,” Phys. Rev. Lett. 62, 2205 (1989).
- E. Knill, R. Laflamme, and G. J. Milburn, “A Scheme for Efficient Quantum Computation with Linear Optics,” Nature (London) 409, 46 (2001).
- T. B. Pittman, M. J. Fitch, B. C. Jacobs, and J. D. Franson, “Experimental Controlled-NOT Logic Gate for Single Photons in the Coincidence Basis,” Phys. Rev. A 68, 032316 (2003).
- J. L. O’Brien, G. J. Pryde, A. G. White, T. C. Ralph, and D. Branning, “Demonstration of an All-Optical Quantum Controlled-NOT Gate,” Nature (London) 426, 264 (2003).
- J. Huang and P. Kumar, “Observation of Quantum Frequency Conversion,” Phys. Rev. Lett. 68, 2153 (1992).
- A. Politi, M. J. Cryan, J. G. Rarity, S. Yu, and J. L. O’Brien, “Silica-on-Silicon Waveguide Quantum Circuits,” Science 320, 646 (2008).
Tiny 'spherules' reveal details about Earth's asteroid impacts

Researchers are learning details about asteroid impacts going back to Earth's early history by using a new method for extracting precise information from tiny "spherules" embedded in layers of rock. The spherules were created when asteroids crashed into Earth, vaporizing rock that expanded into space as a giant vapor plume. Small droplets of molten and vaporized rock in the plume condensed and solidified, falling back to Earth as a thin layer. The round or oblong particles were preserved in layers of rock, and researchers have now analyzed them to extract precise information about asteroids impacting Earth from 3.5 billion to 35 million years ago.

"What we have done is provide the foundation for understanding how to interpret the layers in terms of the size and velocity of the asteroid that made them," said Jay Melosh, an expert in impact cratering and a distinguished professor of earth and atmospheric sciences, physics and aerospace engineering at Purdue University.

The findings, which support a theory that Earth endured an especially heavy period of asteroid bombardment early in its history, are detailed in a research paper appearing online in the journal Nature on April 25. The paper was written by Purdue physics graduate student Brandon Johnson and Melosh. The findings, based on geologic observations, support a theoretical study in a companion paper in Nature by researchers at the Southwest Research Institute in Boulder, Colo.

The period of heavy asteroid bombardment -- from 4.2 to 3.5 billion years ago -- is thought to have been influenced by changes in the early solar system that altered the trajectory of objects in an asteroid belt located between Mars and Jupiter, sending them on a collision course with Earth.

"That's the postulate, and this is the first real solid evidence that it actually happened," Melosh said. "Some of the asteroids that we infer were about 40 kilometers in diameter, much larger than the one that killed off the dinosaurs about 65 million years ago, which was about 12-15 kilometers. But when we looked at the number of impactors as a function of size, we got a curve that showed a lot more small objects than large ones, a pattern that matches exactly the distribution of sizes in the asteroid belt. For the first time we have a direct connection between the crater size distribution on the ancient Earth and the sizes of asteroids out in space."

Because craters are difficult to study directly, impact history must be inferred either from observations of asteroids that periodically pass near Earth or from studies of craters on the moon. The new technique using spherules offers a far more accurate alternative for chronicling asteroid impacts on Earth, Melosh said.

"We can look at these spherules, see how thick the layer is, how big the spherules are, and we can infer the size and velocity of the asteroid," Melosh said. "We can go back to the earliest era in the history of Earth and infer the population of asteroids impacting the planet."

For asteroids larger than about 10 kilometers in diameter, the spherules are deposited in a global layer.

"Some of these impacts were several times larger than the Chicxulub impact that killed off the dinosaurs 65 million years ago," Johnson said. "The impacts may have played a large role in the evolutionary history of life. The large number of impacts may have helped simple life by introducing organics and other important materials at a time when life on Earth was just taking hold."
A 40-kilometer asteroid would have wiped out everything on Earth's surface, whereas the one that struck 65 million years ago killed only land animals weighing more than around 20 kilograms.

"Impact craters are the most obvious indication of asteroid impacts, but craters on Earth are quickly obscured or destroyed by surface weathering and tectonic processes," Johnson said. "However, the spherule layers, if preserved in the geologic record, provide information about an impact even when the source crater cannot be found."

The Purdue researchers studied the spherules using computer models that harness mathematical equations originally developed to calculate the condensation of vapor. "There have been some new wrinkles in vapor condensation modeling that motivated us to do this work, and we were the first to apply it to asteroid impacts," Melosh said.

The spherules are about a millimeter in diameter. The researchers are also studying a different type of artifact, similar to spherules but found only near the original impact site. Whereas the globally distributed spherules come from the condensing vaporized rock, these "melt droplets" are from rock that has been melted but not completely vaporized. "Before this work, it was not possible to distinguish between these two types of formations," Melosh said. "Nobody had established criteria for discriminating between them, and we've done that now."

One of the authors of the Southwest Research Institute paper, David Minton, is now an assistant professor of earth and atmospheric sciences at Purdue.

Findings from the research may enable Melosh's team to enhance an asteroid impact effects calculator he developed to estimate what would happen if asteroids of various sizes were to hit Earth. The calculator, "Impact: Earth!", allows anyone to calculate potential comet or asteroid damage based on the object's mass.

The research has been funded by NASA.

Source: Purdue University
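As a rough illustration of the kind of estimate the impact-effects calculator described above performs, the sketch below computes an impactor's kinetic energy from an assumed diameter, density, and velocity and converts it to megatons of TNT. It is a back-of-the-envelope toy, not the Impact: Earth! code; the 10-kilometer, 3000 kg/m^3, 20 km/s example values are assumptions chosen only for illustration.

```java
/** Back-of-the-envelope impactor energy estimate (illustrative only). */
public class ImpactEnergySketch {

    static final double JOULES_PER_MEGATON_TNT = 4.184e15;

    /** Kinetic energy in joules for a spherical impactor. */
    static double impactEnergy(double diameterMeters, double densityKgM3, double velocityMS) {
        double radius = diameterMeters / 2.0;
        double volume = (4.0 / 3.0) * Math.PI * Math.pow(radius, 3);
        double mass = densityKgM3 * volume;          // m = rho * V
        return 0.5 * mass * velocityMS * velocityMS; // E = 1/2 m v^2
    }

    public static void main(String[] args) {
        // Assumed values: a 10-km rocky asteroid (~3000 kg/m^3) arriving at 20 km/s.
        double joules = impactEnergy(10_000, 3000, 20_000);
        System.out.printf("Energy: %.2e J (~%.1e megatons TNT)%n",
                joules, joules / JOULES_PER_MEGATON_TNT);
    }
}
```

Because the energy scales with the cube of the diameter at fixed density and speed, a 40-kilometer impactor like those inferred from the oldest spherule layers carries roughly 64 times the energy of this 10-kilometer example.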
A new light source for quantum computers

20.02.13 - Researchers have discovered a new way of emitting photons one at a time. They have constructed semiconductor nanowires with "quantum dots" of unprecedented quality - a discovery with implications for the future of quantum computing.

In a future shaped by quantum computing, data will be processed and transmitted by lasers. The quantum properties of light will endow machines with enormous computing potential and extraordinary execution speed. However, much work remains to be done. To exploit the "quantum" potential of light it is necessary, among other things, to be able to emit photons easily one by one.

The "natural" creation of a photon filter

At the Laboratory of Semiconductor Materials (LMSC) of the Institute of Materials, the team of Anna Fontcuberta i Morral has discovered a new method for creating a minuscule and extremely high-performance single-photon source. She found that "quantum dots", or nanocrystals, appear naturally on a certain kind of semiconductor nanowire during the fabrication process. The final structure can then emit photons one by one after having absorbed light. Her discovery is the subject of an article in Nature Materials.

The hidden qualities of nanowires

Nanowires, with diameters of a few tens of nanometres (between 20 and 100), are very efficient at absorbing and manipulating light. By endowing them with nanocrystals, or "quantum dots", it is possible to make them emit single photons by exciting them with a laser beam of a particular frequency. The only hitch is that generating quantum dots on a nanowire is notoriously difficult. The existing methods, which rely on regularly modulating the composition of the nanowire along its length, are hard to reproduce and result in structures with a relatively low photon output.

Discovery through observation

Scientists at the LMSC discovered that perfectly functional quantum dots formed "naturally" on the surface of certain nanowires during the fabrication process. These dots appeared all by themselves at the interface between two basic components: gallium arsenide (GaAs) and aluminium arsenide (AlAs). "No doubt many scientists working on nanowires have created dots without realising it," states Anna Fontcuberta i Morral.

Adjusting nanocrystals for size

Many tests have been carried out in the light of this discovery to verify the efficiency of this new single-photon source. "The calculations and simulations were carried out on the supercomputers of EPFL by the Laboratory of the Theory and Simulation of Materials (THEOS) of Nicola Marzari," says Prof. Fontcuberta i Morral. The structures showed great operating stability, which is rare in nanotechnology. What is more, they are hard-wearing and very bright, which means that their rate of photon output is very high. Even better, by controlling the fabrication of the nanowires, the size of the dots can be tuned to measure. The wavelength of the emitted photons, which depends directly on the size of the dots, can therefore be changed. The nanowire can then be made to receive a laser beam of a certain wavelength, or "colour", in order to generate photons of a chosen colour - infrared, for example.

A hitherto unexplained phenomenon

At present, the phenomenon of the natural creation of dots is not understood by scientists.
The study will therefore proceed along these lines: "It is also about seeing whether it is possible to stimulate the dots not only with lasers but electrically, in order to make them as compatible as possible with all kinds of machines," explains Anna Fontcuberta i Morral. It is worth noting that these photon sources could also be used for molecule detection, or for refining the quantum-encryption methods used to protect data.

* * *

In a traditional computer, calculations are based on the "bit", which can have one of two values: 0 or 1. This is the essence of binary language. In a quantum computer the "qubit" (for example, a photon) can occupy several states at once - states of superposition. It can be 0, 1, or both at the same time. The aim is to maintain photons in their state of superposition so that the computer can carry out multiple calculations in parallel, which will drastically increase the speed of data manipulation. However, this capacity to occupy several states at once cannot be achieved with a photon unless it is well isolated: single-photon sources are therefore much sought after.
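To illustrate how dot size sets the emission colour, as described above, here is a simple estimate based on the textbook particle-in-a-box picture: the confinement energy added to the semiconductor band gap grows as the dot shrinks, so smaller dots emit shorter-wavelength photons. The band gap and effective masses below are rough GaAs textbook values, and the infinite-well model is a deliberate oversimplification of a real GaAs/AlAs interface dot; the numbers are illustrative only.

```java
/** Particle-in-a-box estimate of quantum-dot emission wavelength (toy model). */
public class DotEmissionSketch {

    static final double H = 6.626e-34;      // Planck constant, J*s
    static final double C = 2.998e8;        // speed of light, m/s
    static final double M_E = 9.109e-31;    // electron rest mass, kg
    static final double EV = 1.602e-19;     // joules per electronvolt

    // Approximate GaAs parameters (textbook values, not fitted to the experiment).
    static final double BAND_GAP_EV = 1.42;
    static final double ELECTRON_EFF_MASS = 0.067 * M_E;
    static final double HOLE_EFF_MASS = 0.45 * M_E;

    /** Emission wavelength in nanometres for an infinite well of width L (metres). */
    static double emissionWavelengthNm(double wellWidthM) {
        double confinement = (H * H / (8 * wellWidthM * wellWidthM))
                * (1.0 / ELECTRON_EFF_MASS + 1.0 / HOLE_EFF_MASS);   // joules
        double photonEnergy = BAND_GAP_EV * EV + confinement;        // joules
        return H * C / photonEnergy * 1e9;                           // lambda = h*c / E
    }

    public static void main(String[] args) {
        for (double nm : new double[] {5, 10, 20}) {
            System.out.printf("Dot size %2.0f nm -> ~%.0f nm emission%n",
                    nm, emissionWavelengthNm(nm * 1e-9));
        }
    }
}
```

The exact numbers matter less than the trend: halving the dot size pushes the emission noticeably toward shorter wavelengths, which is the knob the researchers describe tuning during nanowire growth.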
Java in Soft Computing

The human senses interpret perceived external information, which is incomplete and imprecise, and try to form from it the reasoning vital for survival. Fuzzy set theory provides a system for dealing with such information linguistically, performing numerical computation with linguistic labels stipulated by membership functions. The selection of fuzzy if-then rules forms the key component of a fuzzy inference system (FIS), which can appropriately model human expertise in a specific application. An FIS has a structured knowledge representation in the form of if-then rules, but it lacks the adaptability to deal with changing external environments; when an FIS is combined with a neural network or evolutionary computation (such as genetic algorithms), the resulting hybrid system is adaptive.

According to Lotfi A. Zadeh, the founder of fuzzy set theory and fuzzy logic: "Soft Computing is an emerging approach to computing which parallels the remarkable ability of the human mind to reason and learn in an environment of uncertainty and imprecision."

The major components of Soft Computing are:
- Fuzzy Set and Fuzzy Logic -- the subject of this discussion.
- Artificial Neural Network -- The modeling of the brain as a continuous-time, non-linear dynamic system in connectionist architectures that are expected to mimic brain mechanisms and simulate intelligent behavior.
- Evolutionary Computation -- Simulating complex biological evolutionary processes leads to an understanding of how living systems acquired higher-level intelligence. Genetic Algorithm (GA) is based on the evolutionary principle of natural selection. Immune Modeling is based on the assumption that chemical and physical laws may be able to explain living intelligence. Artificial Life is a discipline similar to Immune Modeling, but it also attempts to realize lifelike behavior by imitating the processes that occur in the development of life.
- Bayesian Learning and Statistical Reasoning -- Bayesian reasoning gives inference a probabilistic character. A Bayes model assumes that the quantities of interest are governed by probability distributions and that optimal decisions can be made by reasoning about these probabilities together with observed data. Bayesian reasoning provides the basis for learning algorithms that directly manipulate probabilities, as well as a framework for analyzing the operation of other algorithms that do not explicitly manipulate probabilities.

The field of Soft Computing is changing and evolving rapidly, with new techniques and applications constantly being proposed. Although software can be developed using any one of the individual components of soft computing, there is a tendency to combine two or more components so that one complements the shortfalls of the other. Hybrid systems such as Neuro-Fuzzy (neural net and fuzzy systems), Genetic Neural Network (neural net and genetic algorithm), and Fuzzy-Bayesian Network (fuzzy logic and Bayesian belief network) are common these days.

Where Is Java Now in Computational Intelligence and Soft Computing?

Java has become very popular as a language for writing software in computational and machine intelligence. It is moving fast to join traditional artificial intelligence languages such as Lisp and Prolog as a first choice for writing AI-based software. There is currently an important draft at the Java Community Process (JCP), Java Specification Request 73, an API for data-mining.
The proposed name for this package is javax.datamining, but it has not yet been finalized. The specification lead for this expert group is from Oracle, and it is excellent to see leaders in statistical software such as SPSS and the SAS Institute getting involved in drafting the specification.

What is data-mining? The main goal of data-mining is to automate the extraction of hidden predictive information and patterns from (large) databases. Data-mining applies algorithms from machine learning (computational intelligence) and soft computing, such as artificial neural networks, decision trees and belief networks, fuzzy if-then rules, and rule induction. There has been confusion in the IT community about the meaning of data-mining. It is not data warehousing, SQL queries, or report and data visualization. Data-mining is a major component of today's enterprise software, such as ERP (Enterprise Resource Planning) and CRM (Customer Relationship Management). Expert commentary in Intelligent Enterprise (a Web site for enterprise business intelligence) predicts that business-intelligence software such as CRM will not compete well in the market without analytical functionality. A CRM system whose analytics stop at statistical analysis is not as capable as one that combines statistics with soft computing and computational intelligence.

The algorithms underlying data-mining involve heavy numerical computation, and it is a good move by Sun to develop an API that makes software development easier for mid-level or even entry-level Java developers who need to be involved in a data-mining project. Java developers just need to understand, by reading the API docs, what parameters must be passed to a specific method, which removes the need to understand the complex mathematics implemented in a data-mining algorithm. Data-mining projects have traditionally required people with deep knowledge of mathematics, statistics, and artificial intelligence at the Ph.D. or M.Sc. level. The upcoming javax.datamining package will pull in Java developers of all levels, from expert down to entry level, for any data-mining project. Thus, one mathematician is enough to lead a group, which eliminates the need to assemble a team of Ph.D. developers.

There are already a number of free Java programs and APIs for soft computing and machine learning available online as GPL open source, with new ones appearing almost daily. This shows the explosive popularity of Java in the field of machine intelligence and soft computing.

Evolution of Logic

The following are the different types of logic and their implications or potential applications for technology:
- Bivalent Logic: The conventional "either-or", "true-false" logic formulated by Aristotle; it is the logic of our modern-day computers. A logic gate output can be either 1 or 0, and there is no number in the middle such as 0.7. There is no such thing as uncertainty or imprecision in Bivalent Logic.
- Multi-valued Logic (Fuzzy Set and Fuzzy Logic): Although modern computers and software operate using bivalent logic, it is inefficient for modeling human concepts, which tend to be imprecise and uncertain. Fuzzy Logic allows truth values to take any value between 0 and 1, as in "X is a beautiful person. Y is more beautiful than X.
Z is very, very beautiful." (A small Java sketch at the end of this article illustrates such graded truth values and a linguistic hedge like "very".)
- Quantum Logic: Quite different from bivalent and fuzzy logic in that the truth values interfere with each other, allowing different values to coexist at the same time. A quantum logic gate can exist in both states at the same time, or even in more states concurrently. Quantum computation exploits this for massively parallel computing.

What used to be science fiction decades ago is becoming science fact in today's technology. Peter Shor of AT&T invented the "Shor" quantum algorithm in 1994 and showed that factoring a large integer (a 400-digit number or more) into prime numbers could be done very quickly on a quantum computer (on the order of a year) compared with billions of years on the fastest supercomputer of today. Since the emergence of Shor's algorithm, financial institutions and government agencies such as the NSA have been aware of the potential threat of this technology. It is no surprise that the U.S. government is at the forefront of research into quantum cryptography and encryption. Even a working group at Microsoft has been established to research this alternative model of computing. When the age of quantum computing matures, branches of software engineering such as data warehousing will become obsolete, because quantum computers will search millions or even billions of database records and produce reports in a matter of seconds.

Speed is the main limitation on applying machine intelligence and soft computing with today's computers. One day, quantum computing will remove this limitation. In the field of Computer Vision (software that is trained, for example, to recognize the difference between a bicycle and a tree in an image), today's computers are not yet fast enough to recognize figures in images at scale. The pattern matching of current vision technology is reasonable if the number of images to be matched is fairly low; when the search is over a massive image database, retrieval is going to be slow.

Java is fast establishing itself in all areas of technical computation, from science and engineering to business. With the release of Java Advanced Imaging and Java3D, I have seen Java GPL projects that use soft computing and Computer Vision for scientific and medical imaging.
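To make the fuzzy-logic ideas above concrete, here is a minimal sketch of the graded truth values and the "very" hedge just discussed. It is not taken from any particular FIS library; the membership functions, variable names, and the single rule are invented purely for illustration, with min used as the AND operator, a common (but not the only) choice.

```java
/** Minimal fuzzy-logic sketch: membership functions, a "very" hedge, and one if-then rule. */
public class FuzzySketch {

    /** Triangular membership function: peaks at 'peak', falls to zero at 'low' and 'high'. */
    static double triangular(double x, double low, double peak, double high) {
        if (x <= low || x >= high) return 0.0;
        return x < peak ? (x - low) / (peak - low) : (high - x) / (high - peak);
    }

    /** Membership in the fuzzy set "hot" for a temperature in degrees Celsius. */
    static double hot(double tempC) {
        return triangular(tempC, 20, 35, 50);
    }

    /** Membership in the fuzzy set "humid" for relative humidity in percent. */
    static double humid(double rh) {
        return triangular(rh, 40, 80, 120);
    }

    /** Linguistic hedge: "very X" is commonly modeled as the membership squared. */
    static double very(double membership) {
        return membership * membership;
    }

    public static void main(String[] args) {
        double tempC = 32, rh = 70;

        // Rule: IF temperature is hot AND humidity is humid THEN discomfort is high.
        // With min() as the AND operator, the rule fires to this degree:
        double firingStrength = Math.min(hot(tempC), humid(rh));

        System.out.printf("hot(%.0f C)        = %.2f%n", tempC, hot(tempC));
        System.out.printf("very hot(%.0f C)   = %.2f%n", tempC, very(hot(tempC)));
        System.out.printf("rule firing degree = %.2f%n", firingStrength);
    }
}
```

In a full FIS, many such rules would fire in parallel and their outputs would be aggregated and defuzzified into a crisp number, but the 0-to-1 truth values above are the essential departure from bivalent logic.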