Saturday 31 January 2015

Researchers use sound to slow down, speed up, and block light

This is an artist's visualization of slow light, fast light, and one-way light blocking using BSIT in a series of silica microresonators (slow/red, fast/blue; central yellow shows the blocking effect)

How do you make an optical fiber transmit light only one way?
Researchers from the University of Illinois at Urbana-Champaign have experimentally demonstrated, for the first time, the phenomenon of Brillouin Scattering Induced Transparency (BSIT), which can be used to slow down, speed up, and block light in an optical waveguide. The BSIT phenomenon permits light to travel in the forward direction while light traveling in the backward direction is strongly absorbed. This non-reciprocal behavior is essential for building isolators and circulators that are indispensable tools in an optical designer's toolkit.
In this study, the researchers demonstrated the BSIT phenomenon using nothing more complicated than a glass micro-fiber and a glass sphere adjacent to it.
"Light at certain wavelengths can be absorbed out of a thin optical waveguide by a microresonator--which is essentially a tiny glass sphere--when they are brought very close," explained Gaurav Bahl, an assistant professor of mechanical science and engineering at Illinois. "Through the BSIT phenomenon we can eliminate this opacity, i.e., we can make this system transparent again by adding another laser at a specially chosen wavelength nearby.
"The effect occurs due to the interaction of the light with sound waves present in the material, and is a new physical process that has never been seen before. The most significant aspect of our discovery is the observation that BSIT is a non-reciprocal phenomenon--the transparency is only generated one way. In the other direction, the system still absorbs light."
Time-reversal symmetry (i.e. reciprocity) is a fundamental tenet understood in most acoustic, electromagnetic, and thermodynamic contexts. Engineers are often forced to use tricks to break this time-reversal symmetry for specific device applications.
Current non-reciprocal optical devices--for example, isolators and circulators--are exclusively built using the Faraday magneto-optic effect. This method uses magnetic fields to break time-reversal symmetry in certain specialized garnet and ferrite materials. However, these materials are challenging to obtain at the chip scale through conventional foundry processes. Magnetic fields are also sources of interference in many applications, such as cold atom microsystems. These constraints have so far kept Faraday effect isolators out of on-chip optical systems.
"We have demonstrated a method of obtaining linear optical non-reciprocity that requires no magnets, can be implemented in any common optical material system without needing ferrites, and could be implemented today in any commercial optical foundry," Bahl added. "Brillouin isolators do already exist, but they are nonlinear devices requiring filtering of the scattered light. BSIT, on the other hand, is a linear non-reciprocal mechanism."
"Brillouin-Mandelstam scattering, originally discovered in the early 1920s, is the coupling of light waves and sound waves through electrostrictive optical forces and acousto-optic scattering. It is the fundamental physical process behind BSIT, and occurs in all solids, liquids, gases, and even plasmas," stated JunHwan Kim, a graduate student at Illinois and first author of the paper, "Non-Reciprocal Brillouin Scattering Induced Transparency," appearing in the journal, Nature Physics.
BSIT also enables the speeding up and slowing down of the group velocity of light. Physicists call this "fast" and "slow" light. "Slow" light techniques are extremely useful for quantum information storage and optical buffer applications. Some day, such buffers could be incorporated in quantum computers.
"While it is already known that the slow and fast light can be obtained using Brillouin scattering, our device is far smaller and uses far less power than any other previous demonstration, by several orders-of-magnitude. However, we must sacrifice bandwidth to obtain such performance," Kim added.
In their studies, Bahl's research group uses the extremely minute forces exerted by light to generate and control mechanical vibrations of microscale and nanoscale devices--a field called optomechanics. In resonant microcavities, these minuscule forces can be enhanced by many orders of magnitude. They are using these phenomena to unearth new physics behind how solids, liquids, and gases interact with light.

Story Source:
The above story is based on materials provided by University of Illinois College of Engineering. Note: Materials may be edited for content and length.

Journal Reference:
  1. JunHwan Kim, Mark C. Kuzyk, Kewen Han, Hailin Wang, Gaurav Bahl. Non-reciprocal Brillouin scattering induced transparency. Nature Physics, 2015; DOI: 10.1038/nphys3236

Engineer advances new daytime star tracker

Wallops engineer Scott Heatwole and his team are developing a precision attitude sensor or star tracker that would be able to locate points of reference, or, in other words, stars, during daylight hours. Heatwole specifically developed the technology for the Wallops Arc Second Pointer

Scientists who use high-altitude scientific balloons have high hopes for their instruments in the future. Although the floating behemoths that carry their instruments far into the stratosphere can stay aloft for days on end, data collection typically happens at night, when starlight can be detected. Instruments that operate during the day are limited in their field of view by overwhelming sunlight.

An engineer at NASA's Wallops Flight Facility (WFF), located on Virginia's Eastern Shore, is working on a low-cost, off-the-shelf solution to overcome the challenges of collecting data in daylight.
Under WFF's Balloon Program, engineer Scott Heatwole and his team are developing a precision attitude sensor or star tracker that would be able to locate points of reference, or in other words, stars, during daylight hours. These points of reference serve as landmarks that help orient the instrument so that it can find the target of interest.
The star tracker is being developed specifically for the Wallops Arc Second Pointer (WASP), which would use the star tracker's data to point a balloon-borne scientific payload with incredible accuracy and stability. Currently, WASP employs the widely used ST5000 star tracker. However, this device cannot image in the daytime, even at 120,000 feet where scientific balloons operate. Though the sky is relatively dark at those altitudes, the scattering of sunlight off the atmosphere can overwhelm the starlight in most star cameras.
"A precision attitude sensor capable of working in the daylight would extend science operations through the day which would significantly increase the amount of science collected," Heatwole said. "Currently, the only precision attitude sensor available in daytime is a sun sensor, and this isn't ideal because it provides only two axes of attitude and is not precise over a range of targets across the sky."
Although others have developed custom star trackers that enable around-the-clock science gathering, no one has pulled together an inexpensive, ready-to-go package that includes cameras, computers, and the algorithms necessary to process data and eliminate excess visible light in real time. "That's what we're trying to do," Heatwole said, adding that his daytime star tracker consists of a commercial firewire camera attached to a lens and baffle that help filter out visible light, allowing it to sense points of reference in the near-infrared wavelength bands.
In 2014, a prototype of the device flew on two WASP missions. The first, the flight of the HyperSpectral Imager for Climate Science (HySICS), collected radiance data as WASP pointed the instrument toward the Earth, the sun, and the Moon. The goal was to see what the star tracker saw at 120,000 feet.
The second WASP mission, launched a couple of months later in October, carried the Observatory for Planetary Investigations from the Stratosphere (OPIS). Its mission was to gather measurements of Jupiter's atmospheric structure over time -- a challenge for the new star tracker because the gas giant is a bright object.
"Our algorithm didn't work as we had hoped," Heatwole said, adding that it did not filter out the excess light as expected.
Heatwole, however, is unfazed. Over the coming months, he plans to fine-tune the algorithms to eliminate the extra light experienced during the OPIS mission and then retest the technology during a sounding rocket flight this summer and additional WASP missions in 2016 and 2017.
"We're trying to increase the capabilities of WASP," Heatwole explained. "No company is going to go out and build this. No one is going to develop an off-the-shelf, low-cost daytime star tracker and put all the components in one package. WASP requires an attitude sensor that is capable in the daytime. That's what we hope to create."


Story Source:
The above story is based on materials provided by NASA/Goddard Space Flight Center. Note: Materials may be edited for content and length.

Tuesday 27 January 2015

Rosetta Comet 'pouring' more water into space

Image acquired by the navigation camera on the European Space Agency's Rosetta spacecraft orbiting Comet 67P/Churyumov-Gerasimenko
There has been a significant increase in the amount of water "pouring" out of comet 67P/Churyumov-Gerasimenko, the comet on which the Rosetta mission's Philae lander touched down in November 2014.

The 2.5-mile-wide (4-kilometer) comet was releasing the earthly equivalent of 40 ounces (1.2 liters) of water into space every second at the end of August 2014. The observations were made by NASA's Microwave Instrument for Rosetta Orbiter (MIRO), aboard the European Space Agency's Rosetta spacecraft. Science results from the MIRO team were released today as part of a special Rosetta-related issue of the journal Science.
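For a sense of scale, that production rate can be converted into molecules per second. The sketch below is a back-of-the-envelope estimate, assuming the quoted 1.2 liters refers to liquid-water equivalent; it is not a calculation from the MIRO paper:

    # Rough order-of-magnitude conversion of the quoted outgassing rate.
    AVOGADRO = 6.022e23        # molecules per mole
    MOLAR_MASS_H2O = 0.018     # kg per mole
    RHO_WATER = 1000.0         # kg per cubic metre, liquid-water equivalent

    rate_litres_per_s = 1.2
    mass_rate_kg_s = rate_litres_per_s * 1e-3 * RHO_WATER          # about 1.2 kg/s
    molecules_per_s = mass_rate_kg_s / MOLAR_MASS_H2O * AVOGADRO   # about 4e25 per second
    print(f"{molecules_per_s:.1e} water molecules per second")

That works out to roughly 4 x 10^25 water molecules leaving the nucleus every second at the end of August 2014.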
"In observations over a period of three months [June through August, 2014], the amount of water in vapor form that the comet was dumping into space grew about tenfold," said Sam Gulkis, principal investigator of the MIRO instrument at NASA's Jet Propulsion Laboratory in Pasadena, California, and lead author of a paper appearing in the special issue. "To be up close and personal with a comet for an extended period of time has provided us with an unprecedented opportunity to see how comets transform from cold, icy bodies to active objects spewing out gas and dust as they get closer to the sun."
The MIRO instrument is a small and lightweight spectrometer that can map the abundance, temperature and velocity of cometary water vapor and other molecules that the nucleus releases. It can also measure the temperature up to about one inch (two centimeters) below the surface of the comet's nucleus. One reason the subsurface temperature is important is that the observed gases likely come from sublimating ices beneath the surface. By combining information on both the gas and the subsurface, MIRO will be able to study this process in detail.
Also in the paper released today, the MIRO team reports that 67P spews out more gas from certain locations and at certain times during its "day." The nucleus of 67P consists of two lobes of different sizes (often referred to as the "body" and "head" because of its duck-like shape), connected by a neck region. A substantial portion of the measured outgassing from June through September 2014 occurred from the neck region during the afternoon.
"That situation may be changing now that the comet is getting warmer," said Gulkis. "MIRO observations would need to be carefully analyzed to determine which factors in addition to the sun's warmth are responsible for the cometary outgassing."
Observations are continuing to search for variability in the production rate and changes in the parts of the nucleus that release gas as the comet's distance from the sun changes. This information will help scientists understand how comets evolve as they orbit and move toward and then away from the sun. The gas production rate is also important to the Rosetta navigation team controlling the spacecraft, as this flowing gas can alter the trajectory of the spacecraft.
In another 67P paper released today, it was revealed that the comet's atmosphere, or coma, is much less homogenous than expected and that comet outgassing varies significantly over time.
"If we would have just seen a steady increase of gases as we closed in on the comet, there would be no question about the heterogeneity of the nucleus," said Myrtha Hässig, a NASA-sponsored scientist from the Southwest Research Institute in San Antonio. "Instead we saw spikes in water readings, and a few hours later, a spike in carbon dioxide readings. This variation could be a temperature effect or a seasonal effect, or it could point to the possibility of comet migrations in the early solar system."
The measurements of the coma were made by the Rosetta Orbiter Spectrometer for Ion and Neutral Analysis Double Focusing Mass Spectrometer (ROSINA DFMS) instrument, which measures the coma's composition in situ at the position of the spacecraft. The ROSINA data indicate that the water vapor signal is strongest overall. However, there are periods when the carbon monoxide and carbon dioxide abundances rival that of water.
"Taken together, the MIRO outgassing results and results about heterogeneous fountains from ROSINA suggest fascinating new details to be learned about how comets work,"said Claudia Alexander, NASA project scientist for the U.S. Rosetta team, from JPL. "These results are helping us move the field forward on how comets operate on a fundamental level."
Rosetta is currently about 107 million miles (171 million kilometers) from Earth and about 92 million miles (148 million kilometers) from the sun. Comets are time capsules containing primitive material left over from the epoch when the sun and its planets formed. By studying the gas, dust and structure of the nucleus and organic materials associated with the comet, via both remote and in situ observations, the Rosetta mission should become a key to unlocking the history and evolution of our solar system, as well as answering questions regarding the origin of Earth's water and perhaps even life. Rosetta is the first mission in history to rendezvous with a comet, escort it as it orbits the sun, and deploy a lander to its surface.



Story Source:
The above story is based on materials provided by NASA/Jet Propulsion Laboratory. Note: Materials may be edited for content and length.


Journal Reference:

  1. S. Gulkis, M. Allen, P. von Allmen, G. Beaudin, N. Biver, D. Bockelee-Morvan, M. Choukroun, J. Crovisier, B. J. R. Davidsson, P. Encrenaz, T. Encrenaz, M. Frerking, P. Hartogh, M. Hofstadter, W.-H. Ip, M. Janssen, C. Jarchow, S. Keihm, S. Lee, E. Lellouch, C. Leyrat, L. Rezac, F. P. Schloerb, T. Spilker. Subsurface properties and early activity of comet 67P/Churyumov-Gerasimenko. Science, 2015; 347 (6220): aaa0709 DOI: 10.1126/science.aaa0709

New research re-creates planet formation, super-Earths and giant planets in the laboratory

New laser-driven shock compression experiments on stishovite, a high-density form of silica, provide thermodynamic and electrical conductivity data at unprecedented conditions and reveal the unusual properties of rocks deep inside large exoplanets and giant planets

New laser-driven compression experiments reproduce the conditions deep inside exotic super-Earths and giant planet cores, and the conditions during the violent birth of Earth-like planets, documenting the material properties that determined planet formation and evolution processes.

The experiments, reported in the Jan. 23 edition of Science, reveal the unusual properties of silica -- the key constituent of rock -- under the extreme pressures and temperatures relevant to planetary formation and interior evolution.
Using laser-driven shock compression and ultrafast diagnostics, Lawrence Livermore National Laboratory (LLNL) physicist Marius Millot and colleagues from Bayreuth University (Germany), LLNL and the University of California, Berkeley were able to measure the melting temperature of silica at 500 GPa (5 million atmospheres), a pressure comparable to the core-mantle boundary pressure for a super-Earth planet (5 Earth masses), Uranus and Neptune. It also is the regime of giant impacts that characterize the final stages of planet formation.
"Deep inside planets, extreme density, pressure and temperature strongly modify the properties of the constituent materials," Millot said. "How much heat solids can sustain before melting under pressure is key to determining a planet's internal structure and evolution, and now we can measure it directly in the laboratory."
In combination with prior melting measurements on other oxides and on iron, the new data indicate that mantle silicates and core metal have comparable melting temperatures above 300-500 GPa, suggesting that large rocky planets may commonly have long-lived oceans of magma -- molten rock -- at depth. Planetary magnetic fields can be formed in this liquid-rock layer.
"In addition, our research suggests that silica is likely solid inside Neptune, Uranus, Saturn and Jupiter cores, which sets new constraints on future improved models for the structure and evolution of these planets," Millot said.
Those advances were made possible by a breakthrough in high-pressure crystal growth techniques at Bayreuth University in Germany. There, Natalia Dubrovinskaia and colleagues managed to synthesize millimeter-sized transparent polycrystals and single crystals of stishovite, a high-density form of silica (SiO2) usually found only in minute amounts near meteor-impact craters.
Those crystals allowed Millot and colleagues to conduct the first laser-driven shock compression study of stishovite using ultrafast optical pyrometry and velocimetry at the Omega Laser Facility at the University of Rochester's Laboratory for Laser Energetics.
"Stishovite, being much denser than quartz or fused-silica, stays cooler under shock compression, and that allowed us to measure the melting temperature at a much higher pressure," Millot said. "Dynamic compression of planetary-relevant materials is a very exciting field right now. Deep inside planets hydrogen is a metallic fluid, helium rains, fluid silica is a metal and water may be superionic."
In fact, the recent discovery of more than 1,000 exoplanets orbiting other stars in our galaxy reveals the broad diversity of planetary systems, planet sizes and properties. It also fuels the quest for habitable worlds hosting extraterrestrial life and shines new light on our own solar system. Using the ability to reproduce in the laboratory the extreme conditions found deep inside giant planets, as well as during planet formation, Millot and colleagues plan to study the exotic behavior of the main planetary constituents using dynamic compression, contributing to a better understanding of the formation of the Earth and the origin of life.


Story Source:
The above story is based on materials provided by DOE/Lawrence Livermore National Laboratory. Note: Materials may be edited for content and length.


Journal Reference:
  1. M. Millot, N. Dubrovinskaia, A. Černok, S. Blaha, L. Dubrovinsky, D. G. Braun, P. M. Celliers, G. W. Collins, J. H. Eggert, R. Jeanloz. Shock compression of stishovite and melting of silica at planetary interior conditions. Science, 2015; 347 (6220): 418 DOI: 10.1126/science.1261507

Yes, black holes exist in gravitational theories with unbounded speeds of propagation


Lorentz invariance (LI) is a cornerstone of modern physics and is strongly supported by observations.

In fact, all the experiments carried out so far are consistent with it, and there is no evidence that such a symmetry needs to be broken at any energy scale. Nevertheless, there are various reasons to construct gravitational theories with broken LI. In particular, our understanding of space-times at the Planck scale is still highly limited, and the renormalizability and unitarity of gravity often lead to the violation of LI.
One concrete example is the Horava theory of quantum gravity, in which the LI is broken in the ultraviolet (UV), and the theory can include higher-dimensional spatial derivative operators, so that the UV behavior is dramatically improved and can be made (power-counting) renormalizable.
On the other hand, the exclusion of high-dimensional time derivative operators prevents the ghost instability, whereby the unitarity of the theory -- a problem that has been faced since 1977 [K.S. Stelle, Phys. Rev. D16, 953 (1977)] -- is assured. In the infrared (IR) the lower-dimensional operators take over, so that a healthy low-energy limit presumably results.
However, once LI is broken, different species of particles can travel with different velocities, and in certain theories, such as the Horava theory mentioned above, these velocities can even be arbitrarily large. This suggests that black holes may not exist at all in such theories, as any signal initially trapped inside a horizon could penetrate it and propagate to infinity, as long as the signal has a sufficiently large velocity (or energy). This seems in sharp conflict with current observations, which strongly suggest that black holes exist in our universe [R. Narayan and J.E. MacClintock, Mon. Not. R. Astron. Soc., 419, L69 (2012)].
A potential breakthrough was made recently by Blas and Sibiryakov [D. Blas and S. Sibiryakov, Phys. Rev. D84, 124043 (2011)], who found that there still exist absolute causal boundaries, the so-called universal horizons, and particles even with infinitely large velocities would just move around on these boundaries and cannot escape to infinity.
This immediately attracted a lot of attention. In particular, it was shown that the universal horizon radiates like a blackbody at a fixed temperature and obeys the first law of black hole mechanics [P. Berglund, J. Bhattacharyya, and D. Mattingly, Phys. Rev. D85, 124019 (2012); Phys. Rev. Lett. 110, 071301 (2013)]. The main idea is as follows: in a given space-time, a globally timelike foliation parametrized by a scalar field, the so-called khronon, might exist.
Then there is a surface at which the khronon diverges, even though nothing physically singular happens there, either in the metric or in the space-time itself. Given that the khronon defines an absolute time, any object crossing this surface from the interior would necessarily also move backward in absolute time, which is forbidden by the definition of causality in the theory. Thus, even particles with superluminal velocities cannot penetrate this surface once they are trapped inside it.
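In the technical literature, this causal boundary -- the universal horizon -- is usually characterized by the condition that the Killing vector \zeta^\mu of the static exterior becomes orthogonal to the unit normal u^\mu of the constant-khronon surfaces. Schematically, using notation common to the papers cited above rather than anything specific to this article,

    u_\mu \, \zeta^\mu = 0 \quad \text{at } r = r_{UH}, \qquad u_\mu \propto \partial_\mu \phi ,

where \phi is the khronon field. Inside the radius r_{UH}, any trajectory that moves forward in khronon time is forced to ever smaller radii, which is why even signals of unbounded speed cannot escape.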
In all studies of universal horizons carried out so far, the khronon is part of the gravitational theory involved. To generalize the concept of universal horizons to any gravitational theory with broken LI, Lin, Abdalla, Cai and Wang recently promoted the khronon to a test field -- a role similar to that played by a Killing vector -- so that its existence does not affect the given space-time but instead characterizes its properties.
In this way, such a field is no longer part of the underlying gravitational theory, and it may or may not exist in a given space-time, depending on the properties of the space-time considered. The authors then showed that universal horizons indeed exist, by constructing concrete static charged solutions of Horava gravity. More importantly, they showed that such horizons exist not only in the IR limit of the theory, as had been considered so far in the literature, but also in the full Horava theory of gravity, that is, when high-order operators are not negligible.


Story Source:
The above story is based on materials provided by World Scientific. Note: Materials may be edited for content and length.


Journal Reference:
  1. Kai Lin, Elcio Abdalla, Rong-Gen Cai, Anzhon Wang. Universal horizons and black holes in gravitational theories with broken Lorentz symmetry. International Journal of Modern Physics D, 2014; 1443004 DOI: 10.1142/S0218271814430044

Visualizing interacting electrons in a molecule

Left: chemical structure of cobalt phthalocyanine (CoPC). Right: experimental and theoretical wave functions of CoPC

Understanding these kinds of electronic effects in organic molecules is crucial for their use in optoelectronic applications, for example in organic light-emitting diodes (OLEDs), organic field-effect transistors (OFETs) and solar cells.

In their article published in Nature Physics, the research team demonstrates measurements on the organic molecule cobalt phthalocyanine (CoPC) that can be explained only by taking into consideration how electrons in the molecule interact with each other. CoPC is a commonly used molecule in organic optoelectronic devices. Electron-electron interactions alter its conductivity, which is directly related to device performance.
The Atomic Scale Physics group at Aalto University, headed by Peter Liljeroth, specializes in scanning tunneling microscopy (STM), which utilizes a tiny current between a sharp probe tip and a conducting sample to measure structural and electronic properties of the sample surface with atomic resolution. In this case, the researchers used the STM to measure the current passing through a single molecule on a surface by injecting or removing electrons at different energies.
Within the molecule, electrons 'live' on so-called orbitals, which define their energy and the shape of their quantum mechanical wavefunction. These orbitals can be measured by recording the current through the molecule as a function of the applied voltage.
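In practice, such orbital spectra are usually analyzed through the differential conductance dI/dV, which in the standard STM picture is roughly proportional to the local density of states under the tip. The snippet below is a minimal, generic sketch of that step, using hypothetical example data rather than the authors' analysis code:

    import numpy as np

    def didv_spectrum(voltage, current):
        """Numerically differentiate a tunneling I(V) curve to get dI/dV."""
        return np.gradient(current, voltage)

    # Hypothetical I(V) curve: a linear background plus one step-like molecular resonance.
    V = np.linspace(-2.0, 2.0, 401)                      # bias voltage in volts
    I = 1e-9 * (V + 0.3 * np.tanh((V - 0.8) / 0.05))     # tunneling current in amperes
    dIdV = didv_spectrum(V, I)
    print(f"peak conductance {dIdV.max():.2e} S near V = {V[dIdV.argmax()]:.2f} V")

Extra peaks in measured dI/dV spectra, at voltages where a single-orbital picture predicts none, are the kind of feature described in the next paragraph.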
Fabian Schulz, a post-graduate researcher in Liljeroth's group, was surprised when the measurements on CoPC molecules did not fit the conventional interpretation of STM experiments on single molecules. "We saw several additional features in the recorded current where there should have been none according to the usual interpretation of these so-called tunneling spectra," Schulz explains.
The experiments were performed on cobalt phthalocyanine (CoPC) molecules deposited on a one-atom thick layer of hexagonal boron nitride on an iridium surface.
A colleague from Aalto University and leader of the Quantum Many-Body Physics group, Ari Harju, suggested that the key to understanding the experimental results might be a form of electron-electron interaction that usually is neglected in interpreting such experiments. In collaboration with Ari P. Seitsonen from the University of Zurich, Ari Harju and his team calculated the electronic properties of the molecule, including quantum mechanical effects that went beyond prevailing methods. This novel interpretation was confirmed when Liljeroth and his team were able to match the experimentally measured molecular orbitals with the predictions of the theory. "It was very exciting to see this kind of an interplay between theory and experiment," Liljeroth remarks.
Ari Harju concludes: "The proof that such theoretically predicted, exotic effects can be observed experimentally is an important step forward in understanding how current is transported across individual molecules and molecular assemblies."


Story Source:
The above story is based on materials provided by Aalto University. Note: Materials may be edited for content and length.


Journal Reference:
  1. Fabian Schulz, Mari Ijäs, Robert Drost, Sampsa K. Hämäläinen, Ari Harju, Ari P. Seitsonen, Peter Liljeroth. Many-body transitions in a single molecule visualized by scanning tunnelling microscopy. Nature Physics, 2015; DOI: 10.1038/nphys3212

Helicopter could be 'scout' for Mars rovers

A proposed helicopter could triple the distances that Mars rovers can drive in a Martian day and help pinpoint interesting targets for study

Getting around on Mars is tricky business. Each NASA rover has delivered a wealth of information about the history and composition of the Red Planet, but a rover's vision is limited by the view of onboard cameras, and images from spacecraft orbiting Mars are the only other clues to where to drive it. To have a better sense of where to go and what's worth studying on Mars, it could be useful to have a low-flying scout.

Enter the Mars Helicopter, a proposed add-on to Mars rovers of the future that could potentially triple the distance these vehicles currently drive in a Martian day, and deliver a new level of visual information for choosing which sites to explore.
The helicopter would fly ahead of the rover almost every day, checking out various possible points of interest and helping engineers back on Earth plan the best driving route.
Scientists could also use the helicopter images to look for features for the rover to study in further detail. Another part of the helicopter's job would be to check out the best places for the rover to collect key samples and rocks for a cache, which a next-generation rover could pick up later.
The vehicle is envisioned to weigh 2.2 pounds (1 kilogram) and measure 3.6 feet (1.1 meters) across from the tip of one blade to the other. The prototype body looks like a medium-size cubic tissue box.
The current design is a proof-of-concept technology demonstration that has been tested at NASA's Jet Propulsion Laboratory, Pasadena, California.


Story Source:
The above story is based on materials provided by NASA/Jet Propulsion Laboratory. Note: Materials may be edited for content and length.

3-D view of Greenland Ice Sheet opens window on ice history

This is a cross-section of the age of the Greenland Ice Sheet. Layers determined to be from the Holocene period, formed during the past 11,700 years, are shown in green. Layers accumulated during the last ice age, from 11,700 to 115,000 years ago, are shown in blue. Layers from the Eemian period, more than 115,000 years old, are shown in red. Regions of unknown age are gray

Scientists using ice-penetrating radar data collected by NASA's Operation IceBridge and earlier airborne campaigns have built the first comprehensive map of layers deep inside the Greenland Ice Sheet, opening a window on past climate conditions and the ice sheet's potentially perilous future.

This new map allows scientists to determine the age of large swaths of the second largest mass of ice on Earth, which holds enough water to raise ocean levels by about 20 feet.
"This new, huge data volume records how the ice sheet evolved and how it's flowing today," said Joe MacGregor, the study's lead author, a glaciologist at The University of Texas at Austin Institute for Geophysics (UTIG), a unit of the Jackson School of Geosciences.
Greenland's ice sheet has been losing mass during the past two decades, a phenomenon accelerated by warming temperatures. Scientists are studying ice from different climate periods in the past to better understand how the ice sheet might respond in the future.
Ice cores offer one way of studying the distant past. These cylinders of ice drilled from the ice sheet hold evidence of past snow accumulation and temperature and contain impurities such as dust and volcanic ash compacted over hundreds of thousands of years. These layers are visible in ice cores and can be detected with ice-penetrating radar.
Ice-penetrating radar works by sending radar signals into the ice and recording the strength and return time of reflected signals. From those signals, scientists can detect the ice surface, sub-ice bedrock and layers within the ice.
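The basic depth conversion is straightforward: an echo's two-way travel time, multiplied by the speed of radio waves in ice and divided by two, gives the depth of the reflecting layer. The sketch below assumes a radio-frequency refractive index for glacial ice of about 1.78; it illustrates the principle and is not IceBridge processing code:

    # Convert a radar echo's two-way travel time into an approximate layer depth.
    C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
    N_ICE = 1.78               # assumed radio refractive index of glacial ice

    def echo_depth_m(two_way_time_s):
        """Depth of an internal reflector from its two-way radar travel time."""
        v_ice = C_VACUUM / N_ICE          # roughly 1.68e8 m/s in ice
        return v_ice * two_way_time_s / 2.0

    print(f"{echo_depth_m(20e-6):.0f} m")  # a 20-microsecond echo lies roughly 1.7 km deep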
New techniques used in this study allowed scientists to efficiently pick out these layers in radar data. Prior studies had mapped internal layers, but not at the scale made possible by these newer, faster methods.
Another major factor in this study was the scope of Operation IceBridge's measurements across Greenland, which included flights that covered distances of tens of thousands of kilometers across the ice sheet.
"IceBridge surveyed previously unexplored parts of the Greenland Ice Sheet and did it using state-of-the-art CReSIS radars," said study co-author Mark Fahnestock, an IceBridge science team member and glaciologist from the Geophysical Institute at the University of Alaska Fairbanks (UAF-GI).
CReSIS is the Center for Remote Sensing of Ice Sheets, a National Science Foundation science and technology center headquartered at the University of Kansas in Lawrence, Kansas.
IceBridge's flight lines often intersect ice core sites where other scientists have analyzed the ice's chemical composition to map and date layers in the ice. These core data provide a reference for radar measurements and provide a way to calculate how much ice from a given climate period exists across the ice sheet, something known as an age volume. Scientists are interested in knowing more about ice from the Eemian period, a time from 115,000 to 130,000 years ago that was about as warm as today. This new age volume provides the first data-driven estimate of where Eemian ice may remain.
Comparing this age volume to simple computer models helped the study's team better understand the ice sheet's history. Differences in the mapped and modeled age volumes point to past changes in ice flow or processes such as melting at the ice sheet's base. This information will be helpful for evaluating the more sophisticated ice sheet models that are crucial for projecting Greenland's future contribution to sea-level rise.
"Prior to this study, a good ice-sheet model was one that got its present thickness and surface speed right. Now, they'll also be able to work on getting its history right, which is important because ice sheets have very long memories," said MacGregor.



Story Source:
The above story is based on materials provided by University of Texas at Austin. Note: Materials may be edited for content and length.


Journal Reference:

  1. Joseph A. MacGregor, Mark A. Fahnestock, Ginny A. Catania, John D. Paden, S. Prasad Gogineni, S. Keith Young, Susan C. Rybarski, Alexandria N. Mabrey, Benjamin M. Wagman, Mathieu Morlighem. Radiostratigraphy and age structure of the Greenland Ice Sheet. Journal of Geophysical Research: Earth Surface, 2015; DOI: 10.1002/2014JF003215

Saturday 24 January 2015

Scientists slow down the speed of light travelling in free space

Light beams (stock illustration)

Scientists have long known that the speed of light can be slowed slightly as it travels through materials such as water or glass.

However, it has generally been thought impossible for particles of light, known as photons, to be slowed as they travel through free space, unimpeded by interactions with any materials.
In a new paper published in Science Express today (Friday 23 January), researchers from the University of Glasgow and Heriot-Watt University describe how they have managed to slow photons in free space for the first time. They have demonstrated that applying a mask to an optical beam to give photons a spatial structure can reduce their speed.
The team compare a beam of light, containing many photons, to a team of cyclists who share the work by taking it in turns to cycle at the front. Although the group travels along the road as a unit, the speed of individual cyclists can vary as they swap position.
The group formation can make it difficult to define a single velocity for all cyclists, and the same applies to light. A single pulse of light contains many photons, and scientists know that light pulses are characterised by a number of different velocities.
The team's experiment was configured like a time trial race, with two photons released simultaneously across identical distances towards a defined finish line. The researchers found that one photon reached the finish line as predicted, but the structured photon which had been reshaped by the mask arrived later, meaning it was travelling more slowly in free space. Over a distance of one metre, the team measured a slowing of up to 20 wavelengths, many times greater than the measurement precision.
The work demonstrates that, after passing the light beam through a mask, photons move more slowly through space. Crucially, this is very different to the slowing effect of passing light through a medium such as glass or water, where the light is only slowed during the time it is passing through the material -- it returns to the speed of light after it comes out the other side. The effect of passing the light through the mask is to limit the top speed at which the photons can travel.
The work was carried out by a team from the University of Glasgow's Optics Group, led by Professor Miles Padgett, working with theoretical physicists led by Stephen Barnett, and in partnership with Daniele Faccio from Heriot-Watt University.
Daniel Giovannini, one of the lead authors of the paper, said: "The delay we've introduced to the structured beam is small, measured at several micrometres over a propagation distance of one metre, but it is significant. We've measured similar effects in two different types of beams known as Bessel beams and Gaussian beams."
Co-lead author Jacquiline Romero said: "We've achieved this slowing effect with some subtle but widely-known optical principles. This finding shows unambiguously that the propagation of light can be slowed below the commonly accepted figure of 299,792,458 metres per second, even when travelling in air or vacuum.
"Although we measure the effect for a single photon, it applies to bright light beams too. The effect is biggest when the lenses used to create the beam are large and when the distance over which the light is focused is small, meaning the effect only applies at short range."
Professor Padgett added: "It might seem surprising that light can be made to travel more slowly like this, but the effect has a solid theoretical foundation and we're confident that our observations are correct.
"The results give us a new way to think about the properties of light and we're keen to continue exploring the potential of this discovery in future applications. We expect that the effect will be applicable to any wave theory, so a similar slowing could well be created in sound waves, for example."
The team's paper, titled 'Spatially Structured Photons that Travel in Free Space Slower than the Speed of Light', is published in Science Express, which provides electronic publication of selected papers in advance of print in the journal Science.


Story Source:
The above story is based on materials provided by University of Glasgow. Note: Materials may be edited for content and length.


Journal Reference:
  1. D. Giovannini, J. Romero, V. Potoček, G. Ferenczi, F. Speirits, S. M. Barnett, D. Faccio, M. J. Padgett. Spatially structured photons that travel in free space slower than the speed of light. Science, 2015; DOI: 10.1126/science.aaa3035

How did the universe begin? Hot Big Bang or slow thaw?

Did the universe begin with a hot Big Bang or did it slowly thaw from an extremely cold and almost static state?

Did the universe begin with a hot Big Bang or did it slowly thaw from an extremely cold and almost static state? Prof. Dr. Christof Wetterich, a physicist at Heidelberg University, has developed a theoretical model that complements the nearly 100-year-old conventional model of cosmic expansion. According to Wetterich's theory, the Big Bang did not occur 13.8 billion years ago -- instead, the birth of the universe stretches into the infinite past. This view holds that the masses of all particles constantly increase. The scientist explains that instead of expanding, the universe is shrinking over extended periods of time.

Cosmologists usually call the birth of the universe the Big Bang. The closer we approach the Big Bang in time, the more strongly the geometry of space and time curves. Physicists call this a singularity -- a term describing conditions under which the laws of physics are no longer defined. In the Big Bang scenario, the spacetime curvature becomes infinitely large. Shortly after the Big Bang, the universe was extremely hot and dense. Prof. Wetterich believes, however, that a different "picture" is also possible. If the masses of all elementary particles grow heavier over time and gravitational force weakens, the universe could also have had a very cold, slow start. In that view, the universe always existed and its earliest state was virtually static, with the Big Bang stretching over an infinitely long time in the past. The scientist from the Institute for Theoretical Physics assumes that the earliest "events" that are indirectly observable today came to pass 50 trillion years ago, and not in the billionth of a billionth of a billionth of a second after the Big Bang. "There is no longer a singularity in this new picture of the cosmos," says Prof. Wetterich.
His theoretical model explains dark energy and the early "inflationary universe" with a single scalar field that changes with time, with all masses increasing with the value of this field. "It's reminiscent of the Higgs boson recently discovered in Geneva. This elementary particle confirmed the physicists' assumption that particle masses do indeed depend on field values and are therefore variable," explains the Heidelberg scientist. In Wetterich's approach, all masses are proportional to the value of the so-called cosmon field, which increases in the course of cosmological evolution. "The natural conclusion of this model is a picture of a universe that evolved very slowly from an extremely cold state, shrinking over extended periods of time instead of expanding," explains Prof. Wetterich.
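Schematically, and staying at the level of the description above rather than the paper's full equations, every particle mass is tied to the value of the cosmon field \chi,

    m_i(t) \propto \chi(t), \qquad \text{with } \chi \text{ growing over cosmic time},

and only dimensionless ratios of such quantities are observable. Roughly speaking, the frequencies emitted by atoms scale with their masses, so in this picture the cosmological redshift reflects atoms having been lighter (and radiating at lower frequencies) in the past, rather than wavelengths being stretched by expansion. A rescaling of the metric (a Weyl transformation) then translates "growing masses in a shrinking or static universe" into the conventional "constant masses in an expanding universe," which is the sense in which the two descriptions are equivalent pictures, as discussed below.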
Wetterich stresses that this in no way renders the previous view of the Big Bang "invalid," however. "Physicists are accustomed to describing observed phenomena using different pictures." Light, for example, can be depicted as particles and as a wave. Similarly, his model can be seen as a picture equivalent to the Big Bang. "This is very useful for many practical predictions on the consequences that arise from this new theoretical approach. However, describing the 'birth' of the universe without a singularity does offer a number of advantages," emphasises Prof. Wetterich. "And in the new model, the nagging dilemma of 'there must have been something before the Big Bang' is no longer an issue."


Story Source:
The above story is based on materials provided by Heidelberg University. Note: Materials may be edited for content and length.


Journal References:
  1. C. Wetterich. Variable gravity Universe. Physical Review D, 2014; 89 (2) DOI: 10.1103/PhysRevD.89.024005
  2. C. Wetterich. Universe without expansion. Physics of the Dark Universe, 2013; 2 (4): 184 DOI: 10.1016/j.dark.2013.10.002

How the universe has cooled since the Big Bang fits Big Bang theory

Radio waves from a distant quasar pass through another galaxy on their way to Earth. Changes in the radio waves indicate the temperature of the gas


Astronomers using a CSIRO radio telescope have taken the Universe's temperature, and have found that it has cooled down just the way the Big Bang theory predicts.

Using the CSIRO Australia Telescope Compact Array near Narrabri, NSW, an international team from Sweden, France, Germany and Australia has measured how warm the Universe was when it was half its current age.
"This is the most precise measurement ever made of how the Universe has cooled down during its 13.77 billion year history," said Dr Robert Braun, Chief Scientist at CSIRO Astronomy and Space Science.
Because light takes time to travel, when we look out into space we see the Universe as it was in the past -- as it was when light left the galaxies we are looking at. So to look back half-way into the Universe's history, we need to look half-way across the Universe.
How can we measure a temperature at such a great distance?
The astronomers studied gas in an unnamed galaxy 7.2 billion light-years away [a redshift of 0.89].
The only thing keeping this gas warm is the cosmic background radiation -- the glow left over from the Big Bang.
By chance, there is another powerful galaxy, a quasar (called PKS 1830-211), lying behind the unnamed galaxy.
Radio waves from this quasar come through the gas of the foreground galaxy. As they do so, the gas molecules absorb some of the energy of the radio waves. This leaves a distinctive "fingerprint" on the radio waves.
From this "fingerprint" the astronomers calculated the gas's temperature. They found it to be 5.08 Kelvin (-267.92 degrees Celsius): extremely cold, but still warmer than today's Universe, which is at 2.73 Kelvin (-270.27 degrees Celsius).
According to the Big Bang theory, the temperature of the cosmic background radiation drops smoothly as the Universe expands. "That's just what we see in our measurements. The Universe of a few billion years ago was a few degrees warmer than it is now, exactly as the Big Bang Theory predicts," said research team leader Dr Sebastien Muller of Onsala Space Observatory at Chalmers University of Technology in Sweden.
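The comparison against theory is a one-line calculation. In the standard picture the background temperature scales with redshift as T(z) = T0 (1 + z); the sketch below is a simple consistency check with an assumed present-day value T0 = 2.725 K, not the team's analysis:

    # Expected cosmic background temperature at the absorber's redshift.
    T0 = 2.725          # K, assumed present-day CMB temperature
    z = 0.89            # redshift of the galaxy in front of PKS 1830-211

    T_expected = T0 * (1.0 + z)
    print(f"expected temperature at z = {z}: {T_expected:.2f} K")   # about 5.15 K

The measured 5.08 K sits close to that expectation, which is the agreement Muller describes.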


Story Source:
The above story is based on materials provided by CSIRO Australia. Note: Materials may be edited for content and length.


Journal Reference:
  1. S. Muller , A. Beelen, J. H. Black, S. J. Curran, C. Horellou, S. Aalto, F. Combes, M. Guelin, C. Henkel. A precise and accurate determination of the cosmic microwave background temperature at z=0.89. Astronomy & Astrophysics, 2013 [link]

How does the universe create reason, morality?

Solar system

Recent developments in science are beginning to suggest that the universe naturally produces complexity. The emergence of life in general and perhaps even rational life, with its associated technological culture, may be extremely common, argues Clemson researcher Kelly Smith in a recently published paper in the journal Space Policy.

What's more, he suggests, this universal tendency has distinctly religious overtones and may even establish a truly universal basis for morality.
Smith, a philosopher and evolutionary biologist, applies recent theoretical developments in biology and complex systems theory to attempt new answers to the kind of enduring questions about human purpose and obligation that have long been considered the sole province of the humanities.
He points out that scientists are increasingly beginning to discuss how the basic structure of the universe seems to favor the creation of complexity. The large scale history of the universe strongly suggests a trend of increasing complexity: disordered energy states produce atoms and molecules, which combine to form suns and associated planets, on which life evolves. Life then seems to exhibit its own pattern of increasing complexity, with simple organisms getting more complex over evolutionary time until they eventually develop rationality and complex culture.
And recent theoretical developments in biology and complex systems theory suggest this trend may be real, arising from the basic structure of the universe in a predictable fashion.
"If this is right," says Smith, "you can look at the universe as a kind of 'complexity machine', which raises all sorts of questions about what this means in a broader sense. For example, does believing the universe is structured to produce complexity in general, and rational creatures in particular, constitute a religious belief? It need not imply that the universe was created by a God, but on the other hand, it does suggest that the kind of rationality we hold dear is not an accident."
Smith sees another similarity to religion in the potential moral implications of this idea. If evolution tends to favor the development of sociality, reason, and culture as a kind of "package deal," then it's a good bet that any smart extraterrestrials we encounter will have similar evolved attitudes about their basic moral commitments.
In particular, they will likely agree with us that there is something morally special about rational, social creatures. And such universal agreement, argues Smith, could be the foundation for a truly universal system of ethics.
Smith will soon take sabbatical to lay the groundwork for a book exploring these issues in more detail.


Story Source:
The above story is based on materials provided by Clemson University. Note: Materials may be edited for content and length.


Journal Reference:
  1. Kelly C. Smith. Manifest complexity: A foundational ethic for astrobiology? Space Policy, 2014; 30 (4): 209 DOI: 10.1016/j.spacepol.2014.10.004


Friday 23 January 2015

Huge 3-D displays without 3-D glasses

Billboards of the future could show astonishing 3D effects, thanks to a new technology developed in Austria


A new invention opens the door to a new generation of outdoor displays. Different pictures can be seen at different angles, creating 3D effects without the need for 3D glasses.

Public screenings have become an important part of major sports events. In the future, we will be able to enjoy them in 3D, thanks to a new invention from Austrian scientists. A sophisticated laser system sends laser beams in different directions, so different pictures are visible from different angles. The angular resolution is so fine that the left eye is presented with a different picture than the right one, creating a 3D effect.
In 2013, the young start-up company TriLite Technologies had the idea to develop this new kind of display, which sends beams of light directly to the viewers' eyes. The highly interdisciplinary project was carried out together with the Vienna University of Technology.
Together, TriLite and TU Vienna have created the first prototype. Currently it only has a modest resolution of five pixels by three, but it clearly shows that the system works. "We are creating a second prototype, which will display colour pictures with a higher resolution. But the crucial point is that the individual laser pixels work. Scaling it up to a display with many pixels is not a problem," says Jörg Reitterer (TriLite Technologies and PhD-student in the team of Professor Ulrich Schmid at the Vienna University of Technology).
Every single 3D-Pixel (also called "Trixel") consists of lasers and a moveable mirror. "The mirror directs the laser beams across the field of vision, from left to right. During that movement the laser intensity is modulated so that different laser flashes are sent into different directions," says Ulrich Schmid. To experience the 3D effect, the viewer must be positioned in a certain distance range from the screen. If the distance is too large, both eyes receive the same image and only a normal 2D picture can be seen. The range in which the 3D effect can be experienced can be tuned according to the local requirements.
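A quick sizing exercise shows how fine the angular resolution has to be (assumed, illustrative numbers rather than TriLite's actual specification): the display must send different images into directions separated by less than the angle the two eyes subtend at the viewing distance.

    import math

    eye_separation = 0.065    # m, a typical human interpupillary distance
    viewing_distance = 10.0   # m, assumed distance between viewer and billboard

    required_angle_rad = math.atan2(eye_separation, viewing_distance)
    print(f"angular resolution must beat {math.degrees(required_angle_rad):.2f} degrees")
    # about 0.37 degrees at 10 m; the requirement tightens at larger distances,
    # which is why the 3D effect only works within a certain distance range.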
Hundreds of Images at Once
3D movies in the cinema only show two different pictures -- one for each eye. The newly developed display, however, can present hundreds of pictures. Walking by the display, one can get a view of the displayed object from different sides, just like passing a real object. For this, however, a new video format is required, which has already been developed by the researchers. "Today's 3D cinema movies can be converted into our 3D format, but we expect that new footage will be created especially for our displays -- perhaps with a much larger number of cameras," says Franz Fiedler, CTO of TriLite Technologies.
Compared to a movie screen, the display is very bright. Therefore it can be used outdoors, even in bright sunlight. This is not only interesting for 3D presentations but also for targeted advertisements. Electronic billboards could display different ads, seen from different angles. "Maybe someone wants to appeal specifically to the customers leaving the shop across the street, and a different ad is shown to the people waiting at the bus stop," says Ferdinand Saint-Julien, CEO of TriLite Technologies. Technologically, this would not be a problem.
Entering the market
"We are very happy that the project was so successful in such a short period of time," says Ulrich Schmid. It took only three years to get from the first designs to a working prototype. The technology has now been patented and presented in several scientific publications. The second prototype should be finished by the middle of the year, the commercial launch is scheduled for 2016.


Story Source:
The above story is based on materials provided by Vienna University of Technology. Note: Materials may be edited for content and length.


Journal Reference:
  1. Jörg Reitterer, Franz Fidler, Gerhard Schmid, Thomas Riel, Christian Hambeck, Ferdinand Saint Julien-Wallsee, Walter Leeb, Ulrich Schmid. Design and evaluation of a large-scale autostereoscopic multi-view laser display for outdoor applications. Optics Express, 2014; 22 (22): 27063 DOI: 10.1364/OE.22.027063

Decoding the gravitational evolution of dark matter halos

Researchers at Kavli IPMU and their collaborators have revealed that accounting for environmental effects, such as a gravitational tidal force spread over a scale much larger than a galaxy cluster, is indispensable for explaining the distribution and evolution of dark matter halos around galaxies. A detailed comparison between theory and simulations made this work possible. The results of this study, which are published in Physical Review D as an Editors' Suggestion, contribute to a better understanding of the fundamental physics of the universe.
In the standard scenario for the formation of cosmic structure, dark matter, whose energy budget in the universe is approximately five times greater than that of ordinary matter (e.g., atoms), first gathers gravitationally to form crowded regions, the so-called dark matter halos. These dark matter halos then attract atomic gas and eventually form stars and galaxies. Hence, to extract cosmological information from a three-dimensional galaxy map observed in SDSS BOSS, the SuMIRe project, etc., it is important to understand how the clustering of dark matter halos has evolved gravitationally throughout cosmic history. (This is referred to as the halo bias problem.)
"Various studies have described the halo bias theoretically," said Teppei Okumura, a project researcher involved in the study from Kavli IPMU. "However, none of them reproduced simulation results well. So, we extended prior studies motivated by a mathematical symmetry argument and examined if our extension works."
The authors demonstrate that higher-order nonlocal terms originating from environmental effects such as gravitational tidal force must be taken into account to explain the halo bias in simulations. They also confirm that the size of the effect agrees well with a simple theoretical prediction.
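In the halo-bias literature, this kind of expansion is often written schematically (this is the generic form used in earlier work, not necessarily the exact notation of the paper) as

    \delta_h = b_1 \, \delta + \tfrac{1}{2} b_2 \, \delta^2 + b_{s^2} \, s^2 + \text{higher-order nonlocal terms} + \dots ,

where \delta is the matter overdensity, s_{ij} = (\partial_i \partial_j / \nabla^2 - \tfrac{1}{3}\delta_{ij}) \, \delta is the tidal field (\delta_{ij} being the Kronecker delta), and s^2 = s_{ij} s_{ij}. The terms beyond b_1 and b_2 are the "nonlocal" environmental contributions: they depend on a halo's tidal environment rather than on the local density alone, and they are the pieces this study shows cannot be neglected.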
"The results of our study allow the distribution of dark matter halos to be more accurately predicted by properly taking into account higher-order terms missed in the literature," said Shun Saito, the principal investigator of the study from Kavli IPMU. "Our refined model has been already applied to actual data analysis in the BOSS project. This study certainly improves the measurement of the nature of dark energy or neutrino masses. Hence, it has led to a better understanding of the fundamental physics of the universe."
This study is supported by a Grant-in-Aid from the Japan Society for the Promotion of Science (JSPS) No. 25887012.

Story Source:
The above story is based on materials provided by Kavli Institute for the Physics and Mathematics of the Universe. Note: Materials may be edited for content and length.

Journal Reference:
  1. Shun Saito, Tobias Baldauf, Zvonimir Vlah, Uroš Seljak, Teppei Okumura, Patrick McDonald. Understanding higher-order nonlocal halo bias at large scales by combining the power spectrum with the bispectrum. Physical Review D, 2014; 90 (12) DOI: 10.1103/PhysRevD.90.123522