mercredi 14 septembre 2016

Know Thy Star, Know Thy Planet

NASA - Kepler Mission patch.

Sept. 14, 2016

When it comes to exoplanets, astronomers have realized that they know the properties of the planets they discover only as well as they know the properties of the stars those planets orbit. For a planet's size, precisely characterizing the host star can mean the difference between classifying a distant world as small like Earth or huge like Jupiter.

Determining the size of an exoplanet (a planet outside the solar system) depends critically on knowing not only the radius of its host star but also whether that star is single or has a close companion. About half of the stars in the sky are not one star but two orbiting each other, which makes knowing whether a star is binary paramount.

Artist's view of TRAPPIST-1. Image Credit: NASA

One particularly interesting and relatively nearby star, named TRAPPIST-1, recently caught the attention of a team of researchers. They wanted to determine whether TRAPPIST-1, which is home to three small, potentially rocky planets—one of which orbits in the temperate habitable zone where liquid water might pool on the surface—was a single star like the sun, or whether it had a companion star. If TRAPPIST-1 did have a companion, the discovered planets would be larger, possibly large enough to be ice giants similar to Neptune.

If an exoplanet orbits a star in a binary system but astronomers believe the starlight captured by the telescope comes from a single star, the real radius of the planet will be larger than measured. The error can range from about 10 percent to more than a factor of two, depending on the brightness of the companion star in the system.
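The dilution described above can be made concrete with a small back-of-the-envelope sketch (not from the paper; the function name and flux values are illustrative): the extra light from an unresolved companion makes the transit look shallower, so the true planet radius is the measured one multiplied by the square root of the total-to-host flux ratio.

```python
import math

def corrected_radius(measured_radius, flux_host, flux_companion):
    """True planet radius when an unresolved companion dilutes the transit.

    The observed transit depth is (Rp/Rs)^2 * F_host / (F_host + F_comp),
    so the true radius is the measured one scaled by the square root of
    the dilution factor. Assumes the planet orbits the brighter star.
    """
    dilution = (flux_host + flux_companion) / flux_host
    return measured_radius * math.sqrt(dilution)

# An equal-brightness companion inflates the radius by sqrt(2), about 41 percent.
print(corrected_radius(1.0, 1.0, 1.0))   # ≈ 1.414
# A companion contributing 21 percent as much light gives a 10 percent correction.
print(corrected_radius(1.0, 1.0, 0.21))  # ≈ 1.100
```

If the planet instead orbits the fainter member of the pair, the dilution is larger still, which is how the correction can exceed a factor of two.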

To confirm or deny the single star nature of TRAPPIST-1, Steve Howell, senior research scientist at NASA's Ames Research Center at Moffett Field, California, led an investigation of the star. Using a specially designed camera, called the Differential Speckle Survey Instrument or DSSI, Howell and his team measured the rapid disturbances in the light emitted by the star caused by the Earth’s atmosphere and corrected for them. The resultant high-resolution image revealed that the light coming from the TRAPPIST-1 system is from a single star.

With the confirmation that no companion star resides in the vicinity of TRAPPIST-1, the research team's result validates not only that transiting planets are responsible for the periodic dips seen in the star's brightness, but also that they are indeed Earth-size and likely to be rocky worlds.

"Knowing that a terrestrial-size, potentially rocky planet orbits in the habitable zone of a star only 40 light-years from Earth is an awesome finding," said Howell. "The TRAPPIST-1 system will continue to be studied in great detail, as these transiting exoplanets offer one of the best chances to characterize the atmosphere of an alien world."

Mounted on the 8-meter Gemini Observatory South telescope in Chile, the DSSI provided astronomers with the highest resolution images available today from a single ground-based telescope. The nearness of TRAPPIST-1 allowed astronomers to peer deep into the system, probing regions closer to the star than Mercury is to our sun.

The paper describing the result is published in the Sept. 13 issue of The Astrophysical Journal Letters.

Interest in the recently discovered TRAPPIST-1, with its three Earth-size planets, is high. Astronomically speaking, at 40 light-years from Earth the system is a hop, skip and a jump away. The star itself is a dim M-type star, very small and cool relative to most stars, which makes transit detection of small planets easier.

Further detailed measurement of the planetary transits seen in TRAPPIST-1 will begin later this year, when NASA's Kepler space telescope, in its K2 mission, will precisely monitor minute changes in the light emitted from the star over a period of about 75 days.

The space-based observations from the Kepler spacecraft will provide extremely precise measurements of the planet transit shapes allowing for more refined radius and orbital period determination. Noting variations in the mid-time of the transit events can also help astronomers determine the planet masses. Additionally, the new observations will be searched for more transiting planets in the TRAPPIST-1 system.

Speckle interferometry, the imaging technique used by the DSSI, is a powerful asset in the astronomer's toolkit as it provides a unique capability to characterize the environment around distant stars. The technique provides ultra high-resolution images by taking multiple extremely short (40-60 millisecond) exposures of a star to capture fine detail in the received light and “freeze” the turbulence caused by Earth’s atmosphere.

By combining the many thousands of exposures and using mathematical techniques to remove the momentary distortions caused by Earth’s atmosphere, the final result provides a resolution equal to the theoretical limit of what the 8-meter Gemini telescope would produce if no atmosphere were present.
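The "freeze and combine" idea can be sketched in a toy simulation (DSSI actually works with Fourier-plane speckle statistics; this simpler shift-and-add relative, with purely simulated data, is only meant to show why many short exposures beat one long blurred one):

```python
import numpy as np

rng = np.random.default_rng(0)

def shift_and_add(frames):
    """Toy shift-and-add combination of short exposures.

    Each short exposure freezes the atmosphere; re-centering every frame
    on its brightest speckle before averaging recovers detail that a
    single long exposure would smear into a blur.
    """
    h, w = frames[0].shape
    out = np.zeros((h, w))
    for f in frames:
        y, x = np.unravel_index(np.argmax(f), f.shape)  # brightest speckle
        out += np.roll(np.roll(f, h // 2 - y, axis=0), w // 2 - x, axis=1)
    return out / len(frames)

# Simulate 1000 short exposures of a point source jittered by the atmosphere.
frames = []
for _ in range(1000):
    img = np.zeros((33, 33))
    jy, jx = rng.integers(-5, 6, size=2)   # random atmospheric tilt
    img[16 + jy, 16 + jx] = 1.0
    frames.append(img + rng.normal(0, 0.01, img.shape))

result = shift_and_add(frames)
# The combined image has a sharp peak at the center despite the jitter.
print(np.unravel_index(np.argmax(result), result.shape))  # → (16, 16)
```

Averaging the raw frames without re-centering would spread the source over the full jitter range; the re-centering step is what "removes" the momentary distortions.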

Image above: The four-panel graphic illustrates the difference of measured starlight when seen through a ground-based telescope with (top left corner) and without the blurring effects caused by Earth's atmosphere. The technique to neutralize Earth's atmospheric blur is called speckle interferometry. All four images are shown at the same scale. Image Credits: Gemini Observatory/AURA and NASA/Ames/W. Stenzel.

Howell and his team at NASA Ames are currently undertaking the construction of two new speckle interferometric instruments. One of the new instruments will be delivered this fall to the 3.5-meter WIYN telescope located at Kitt Peak National Observatory outside of Tucson, Arizona, where it will be used by the NN_EXPLORE guest observer research program. The other is being developed for the Gemini Observatory North telescope located on Mauna Kea in Hawaii.

NASA Ames manages the Kepler and K2 missions for NASA's Science Mission Directorate. NASA's Jet Propulsion Laboratory in Pasadena, California, managed Kepler mission development. Ball Aerospace & Technologies Corporation operates the flight system with support from the Laboratory for Atmospheric and Space Physics at the University of Colorado at Boulder.

Related link:

NN_EXPLORE guest observer research program: https://exoplanets.nasa.gov/exep/NNExplore/GuestObserver/

To learn more about the result from the Gemini Observatory, visit: http://www.gemini.edu/node/12567.

For more information on the Kepler and the K2 mission, visit: http://www.nasa.gov/kepler.

Images (mentioned), Text, Credits: NASA/Ames Research Center/Michele Johnson.

Greetings, Orbiter.ch

Fighting Cancer with Space Research

JPL - Jet Propulsion Laboratory logo.

September 14, 2016

JPL and National Cancer Institute Renew Big Data Partnership

Every day, NASA spacecraft beam down hundreds of petabytes of data, all of which has to be codified, stored and distributed to scientists across the globe. Increasingly, artificial intelligence is helping to "read" this data as well, highlighting similarities between datasets that scientists might miss.

For the past 15 years, the big data techniques pioneered by NASA's Jet Propulsion Laboratory in Pasadena, California, have been revolutionizing biomedical research. On Sept. 6, 2016, JPL and the National Cancer Institute (NCI), part of the National Institutes of Health, renewed a research partnership through 2021, extending the development of data science that originated in space exploration and is now supporting new cancer discoveries.

The NCI-supported Early Detection Research Network (EDRN) is a consortium of biomedical investigators who share anonymized data on cancer biomarkers: chemical or genetic signatures related to specific cancers. Their aim is to pool all their research data into a single, searchable network, with the goal of translating their collective work into techniques for early diagnosis of cancer or cancer risk.

Image above: NGC 3718, NGC 3729 and other galaxies have been analyzed using machine learning algorithms that can be "taught" to recognize astrophysical similarities. The same technology is now being applied to cancer images, as well. Image Credits: Catalina Sky Survey, U of Arizona, and Catalina Realtime Transient Survey, Caltech.

In the time they've worked together, JPL and EDRN's efforts have led to the discovery of six new Food and Drug Administration-approved cancer biomarkers and nine biomarkers approved for use in Clinical Laboratory Improvement Amendments labs. The FDA has approved each of these biomarkers for use in cancer research and diagnosis, and they have been used in more than 1 million patient diagnostic tests worldwide.

"After the founding of EDRN in 2000, the network needed expertise to take data from multiple studies on cancer biomarkers and create a single, searchable network of research findings for scientists," said Sudhir Srivastava, chief of NCI's Cancer Biomarkers Research Group and head of EDRN. JPL had decades of experience doing similar work for NASA, where spacecraft transmit hundreds of petabytes of data to be coded, stored and distributed to scientists across the globe.

Dan Crichton, the head of JPL's Center for Data Science and Technology, a joint initiative with Caltech in Pasadena, California, helped establish a JPL-based informatics center dedicated to supporting EDRN's big data efforts. In the renewed partnership, JPL is expanding its data science efforts, researching and applying technologies for additional NCI-funded programs. Those programs include EDRN, the Consortium for Molecular and Cellular Characterization of Screen-Detected Lesions, and the Informatics Technology for Cancer Research initiative.

"From a NASA standpoint, there are significant opportunities to develop new data science capabilities that can support both the mission of exploring space and cancer research using common methodological approaches," Crichton said. "We have a great opportunity to perfect those techniques and grow JPL's data science technologies, while serving our nation."

Crichton said JPL has led the way when it comes to taking data from raw observations to scientific conclusions. One example: JPL often deals with measurements from a variety of sensors -- say, cameras and mass spectrometers. Both can be used to study a star, planet or similar target object. But it takes special software to recognize that readings from very different instruments relate to one another.

There's a similar problem in cancer research, where readings from different biomedical tests or instruments require correlation with one another. For that to happen, data have to be standardized, and algorithms must be "taught" to know what they're looking for.

Image above: A lung specimen that was analyzed using the same machine learning algorithms that were originally developed for space research. Image Credits: Early Research Detection Network/University of Colorado.

Since its founding, EDRN's major challenge has been access. Research centers all over the United States had large numbers of biomarker specimens, but each had its own way of labeling, storing and sharing its datasets. Ten sites may have high-quality specimens for study, but if their common data elements (age of patient, cancer type and other characteristics) aren't listed uniformly, they can't be studied as a whole.

"We didn't know if they were early-stage or late-stage specimens, or if any level of treatment had been tried," Srivastava said. "And JPL told us, 'We do this type of thing all the time! That's how we manage our Planetary Data System.'"

As the network has developed, it has added members from dozens of institutions, including Dartmouth College's Geisel School of Medicine; Harvard Medical School's Massachusetts General Hospital; Stanford's NIST Genome-Scale Measurements Group; University of Texas' MD Anderson Cancer Center; and numerous others.

Christos Patriotis, program director at NCI's Cancer Biomarkers Research Group, said the network's members now include international researchers from the U.K., China, Japan, Australia, Israel and Chile.

"The more we expand, the more data we integrate," Patriotis said. "Instead of being silos, now our partners can integrate their findings. Each system can speak to the others."

As JPL and NCI's collaboration advances, next steps include image recognition technology, such as helping EDRN archive images of cancer specimens. Those images could be analyzed by computer vision, which is currently used to spot similarities in star clusters and other astrophysics research.

In the near future, Crichton said, machine learning algorithms could compare a CT scan with an archive of similar images, searching for early signs of cancer based on a patient's age, ethnic background and other demographics.

"As we develop more automated methods for detecting and classifying features in images, we see great opportunities for enhancing data discovery," Crichton said. "We have examples where algorithms for detection of features in astronomy images have been transferred to biology and vice-versa."

Related link:

Early Detection Research Network (EDRN): https://edrn.nci.nih.gov/

JPL-based informatics center: http://cancer.jpl.nasa.gov/

For more information on the research, visit: http://edrn.cancer.gov

Caltech manages JPL for NASA.

Images (mentioned), Text, Credits: NASA/JPL/Andrew Good.

Greetings, Orbiter.ch

Gaia’s billion-star map hints at treasures to come

ESA - Gaia Mission patch.

14 September 2016

The first catalogue of more than a billion stars from ESA’s Gaia satellite was published today – the largest all-sky survey of celestial objects to date.

On its way to assembling the most detailed 3D map ever made of our Milky Way galaxy, Gaia has pinned down the precise position on the sky and the brightness of 1142 million stars.

Gaia’s first sky map

As a taster of the richer catalogue to come in the near future, today’s release also features the distances and the motions across the sky for more than two million stars.

“Gaia is at the forefront of astrometry, charting the sky at precisions that have never been achieved before,” says Alvaro Giménez, ESA’s Director of Science.

“Today’s release gives us a first impression of the extraordinary data that await us and that will revolutionise our understanding of how stars are distributed and move across our Galaxy.”

Gaia mapping the stars of the Milky Way

Launched 1000 days ago, Gaia started its scientific work in July 2014. This first release is based on data collected during its first 14 months of scanning the sky, up to September 2015.

“The beautiful map we are publishing today shows the density of stars measured by Gaia across the entire sky, and confirms that it collected superb data during its first year of operations,” says Timo Prusti, Gaia project scientist at ESA.

The stripes and other artefacts in the image reflect how Gaia scans the sky, and will gradually fade as more scans are made during the five-year mission.

“The satellite is working well and we have demonstrated that it is possible to handle the analysis of a billion stars. Although the current data are preliminary, we wanted to make them available for the astronomical community to use as soon as possible,” adds Dr Prusti.

Transforming the raw information into useful and reliable stellar positions to a level of accuracy never possible before is an extremely complex procedure, entrusted to a pan-European collaboration of about 450 scientists and software engineers: the Gaia Data Processing and Analysis Consortium, or DPAC.

“Today’s release is the result of a painstaking collaborative work over the past decade,” says Anthony Brown from Leiden University in the Netherlands, and consortium chair.

“Together with experts from a variety of disciplines, we had to prepare ourselves even before the start of observations, then treated the data, packaged them into meaningful astronomical products, and validated their scientific content.”

Gaia scanning the sky

In addition to processing the full billion-star catalogue, the scientists looked in detail at the roughly two million stars in common between Gaia’s first year and the earlier Hipparcos and Tycho-2 Catalogues, both derived from ESA’s Hipparcos mission, which charted the sky more than two decades ago.

By combining Gaia data with information from these less precise catalogues, it was possible to start disentangling the effects of ‘parallax’ and ‘proper motion’ even from the first year of observations only. Parallax is a small motion in the apparent position of a star caused by Earth’s yearly revolution around the Sun and depends on a star’s distance from us, while proper motion is due to the physical movement of stars through the Galaxy.
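The parallax-to-distance conversion behind such measurements is simple enough to sketch (an illustrative helper, not DPAC code): a star's distance in parsecs is the reciprocal of its parallax in arcseconds.

```python
def parallax_to_distance_ly(parallax_mas):
    """Distance from annual parallax: d [parsec] = 1 / p [arcsec].

    Parallax is usually quoted in milliarcseconds (mas); one parsec
    is about 3.26 light-years.
    """
    parsecs = 1.0 / (parallax_mas / 1000.0)
    return parsecs * 3.26

# A star with a 10 mas parallax lies 100 parsecs, or about 326 light-years, away.
print(parallax_to_distance_ly(10.0))  # → ≈ 326
```

The smaller the parallax, the larger the distance and the larger the relative error, which is why combining Gaia's first year with the older Hipparcos and Tycho-2 positions was needed to separate parallax from proper motion.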

In this way, the scientists were able to estimate distances and motions for the two million stars spread across the sky in the combined Tycho–Gaia Astrometric Solution, or TGAS.

This new catalogue is twice as precise and contains almost 20 times as many stars as the previous definitive reference for astrometry, the Hipparcos Catalogue.

As part of their work in validating the catalogue, DPAC scientists have conducted a study of open stellar clusters – groups of relatively young stars that were born together – that clearly demonstrates the improvement enabled by the new data.

Galaxies, open and globular clusters in Gaia's sky map

“With Hipparcos, we could only analyse the 3D structure and dynamics of stars in the Hyades, the nearest open cluster to the Sun, and measure distances for about 80 clusters up to 1600 light-years from us,” says Antonella Vallenari from the Istituto Nazionale di Astrofisica (INAF) and the Astronomical Observatory of Padua, Italy.

“But with Gaia’s first data, it is now possible to measure the distances and motions of stars in about 400 clusters up to 4800 light-years away.

For the closest 14 open clusters, the new data reveal many stars surprisingly far from the centre of the parent cluster, likely escaping to populate other regions of the Galaxy.”

Many more stellar clusters will be discovered and analysed in even greater detail with the extraordinary data that Gaia continues to collect and that will be released in the coming years.

From the Solar System to the Hyades cluster

The new stellar census also contains 3194 variable stars, stars that rhythmically swell and shrink in size, leading to periodic brightness changes.

Many of the variables seen by Gaia are in the Large Magellanic Cloud, one of our galactic neighbours, a region that was scanned repeatedly during the first month of observations, allowing accurate measurement of their changing brightness.

Details about the brightness variations of these stars, 386 of which are new discoveries, are published as part of today’s release, along with a first study to test the potential of the data.

“Variable stars like Cepheids and RR Lyraes are valuable indicators of cosmic distances,” explains Gisella Clementini from INAF and the Astronomical Observatory of Bologna, Italy.

“While parallax is used to measure distances to large samples of stars in the Milky Way directly, variable stars provide an indirect, but crucial step on our ‘cosmic distance ladder’, allowing us to extend it to faraway galaxies.”

This is possible because some kinds of variable stars are special. For example, in the case of Cepheid stars, the brighter they are intrinsically, the slower their brightness variations. The same is true for RR Lyraes when observed in infrared light. The variability pattern is easy to measure and can be combined with the apparent brightness of a star to infer its true brightness.
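That "true brightness from variability" step can be sketched as follows (the period-luminosity coefficients below are illustrative values close to published V-band Cepheid calibrations, not Gaia's own): combine the period-luminosity relation with the distance modulus.

```python
import math

def cepheid_distance_ly(period_days, apparent_mag, a=-2.43, b=-4.05):
    """Distance to a classical Cepheid from its period and apparent brightness.

    Uses a period-luminosity relation M = a * (log10 P - 1) + b to infer
    the true (absolute) magnitude from the variability period, then the
    distance modulus m - M = 5 * log10(d / 10 pc) to get the distance.
    """
    abs_mag = a * (math.log10(period_days) - 1.0) + b
    distance_pc = 10.0 ** ((apparent_mag - abs_mag + 5.0) / 5.0)
    return distance_pc * 3.26

# A 10-day Cepheid (absolute magnitude about -4) seen at apparent magnitude
# 10.95 has a distance modulus of 15, i.e. 10 kiloparsecs away.
print(round(cepheid_distance_ly(10.0, 10.95)))  # → 32600
```

Gaia's role is to pin down the parallax distances of nearby Cepheids and RR Lyrae stars, which recalibrates the coefficients of that relation and so tightens every distance inferred from it further out.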

This is where Gaia steps in: in the future, scientists will be able to determine very accurate distances to a large sample of variable stars via Gaia's measurements of parallaxes. With those, they will calibrate and improve the relation between the period and brightness of these stars, and apply it to measure distances beyond our Galaxy. A preliminary application of data from the TGAS looks very promising.

“This is only the beginning: we measured the distance to the Large Magellanic Cloud to test the quality of the data, and we got a sneak preview of the dramatic improvements that Gaia will soon bring to our understanding of cosmic distances,” adds Dr Clementini.

Pluto occultation

Knowing the positions and motions of stars in the sky to astonishing precision is a fundamental part of studying the properties and past history of the Milky Way and to measure distances to stars and galaxies, but also has a variety of applications closer to home – for example, in the Solar System.

In July, Pluto passed in front of a distant, faint star, offering a rare chance to study the atmosphere of the dwarf planet as the star gradually disappeared and then reappeared behind Pluto.

This stellar occultation was visible only from a narrow strip stretching across Europe, similar to the totality path that a solar eclipse lays down on our planet’s surface. Precise knowledge of the star’s position was crucial to point telescopes on Earth, so the exceptional early release of the Gaia position for this star, which was 10 times more precise than previously available, was instrumental to the successful monitoring of this rare event.

Early results hint at a pause in the puzzling pressure rise of Pluto’s tenuous atmosphere, something that has been recorded since 1988 in spite of the dwarf planet moving away from the Sun, which would suggest a drop in pressure due to cooling of the atmosphere.

“These three examples demonstrate how Gaia’s present and future data will revolutionise all areas of astronomy, allowing us to investigate our place in the Universe, from our local neighbourhood, the Solar System, to Galactic and even grander, cosmological scales,” explains Dr Brown.

This first data release shows that the mission is on track to achieve its ultimate goal: charting the positions, distances, and motions of one billion stars – about 1% of the Milky Way’s stellar content – in three dimensions to unprecedented accuracy.

“The road to today has not been without obstacles: Gaia encountered a number of technical challenges and it has taken an extensive collaborative effort to learn how to deal with them,” says Fred Jansen, Gaia mission manager at ESA.

“But now, 1000 days after launch and thanks to the great work of everyone involved, we are thrilled to present this first dataset and are looking forward to the next release, which will unleash Gaia’s potential to explore our Galaxy as we've never seen it before.”

Notes for Editors:

The data from Gaia’s first release can be accessed at http://archives.esac.esa.int/gaia

The content of this first release was presented today during a media briefing at ESA’s European Space Astronomy Centre (ESAC) in Villanueva de la Cañada, Madrid, Spain.

Fifteen scientific papers describing the data contained in the release and their validation process will appear in a special issue of Astronomy & Astrophysics: http://www.aanda.org/component/toc/?task=topic&id=641

Gaia is an ESA mission to survey one billion stars in our Galaxy and local galactic neighbourhood in order to build the most precise 3D map of the Milky Way and answer questions about its structure, origin and evolution.

A large pan-European team of expert scientists and software developers, the Data Processing and Analysis Consortium, located in and funded by many ESA member states, is responsible for the processing and validation of Gaia’s data, with the final objective of producing the Gaia Catalogue. Scientific exploitation of the data will only take place once they are openly released to the community.

Members of the consortium come from 20 European countries (Austria, Belgium, Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Ireland, Italy, Netherlands, Poland, Portugal, Slovenia, Spain, Switzerland, Sweden and the UK) as well as from further afield (Algeria, Brazil, Israel and the US).

In addition, ESA makes a significant contribution to the consortium in the form of the Data Processing Centre at ESAC, which, among other tasks and responsibilities, acts as the central hub for all Gaia data processing.

Related links:

Gaia: http://www.esa.int/Our_Activities/Space_Science/Gaia

Gaia overview: http://www.esa.int/Our_Activities/Space_Science/Gaia/Gaia_overview

Gaia factsheet: http://www.esa.int/Our_Activities/Space_Science/Gaia/Gaia_factsheet

Frequently asked questions: http://www.esa.int/Our_Activities/Space_Science/Gaia/Frequently_Asked_Questions_about_Gaia

Gaia brochure: http://www.esa.int/About_Us/ESA_Publications/ESA_BR-296_Gaia_ESA_s_galactic_census

Images, Videos, Text, Credits: ESA/Gaia/DPAC/ATG medialab; background: ESO/S. Brunier/B. Sicardy (LESIA, Observatoire de Paris, France), P. Tanga (Observatoire de la Côte d'Azur, Nice, France), A. Carbognani (Osservatorio Astronomico Valle d'Aosta, Italy), Rodrigo Leiva (LESIA, Observatoire de Paris)/Video: ESA/Gaia/DPAC; acknowledgement: S. Jordan & T. Sagristà Sellés (Zentrum für Astronomie der Universität Heidelberg).

Best regards, Orbiter.ch

mardi 13 septembre 2016

After Strong El Niño Winter, NASA Model Sees Return to Normal

NASA - Goddard Space Flight Center logo.

Sept. 13, 2016

Not too hot, not too cold – instead, water temperatures in the equatorial Pacific Ocean should be just around normal for the rest of 2016, according to forecasts from the Global Modeling and Assimilation Office, or GMAO. With these neutral conditions, scientists with the modeling center at NASA’s Goddard Space Flight Center say there is unlikely to be a La Niña event in late 2016.

Last winter saw an extremely strong El Niño event, in which warmer-than-normal water sloshed toward the eastern Pacific Ocean. Historically, some of the larger El Niño events are followed by a La Niña event, in which deep, colder-than-normal water surfaces in the eastern Pacific Ocean, off the coast of South America.


Animation above: Sea surface temperature patterns of the 2015 El Niño in the Pacific Ocean unfolded differently than those seen in the 1997-1998 El Niño. Animation Credits: NASA.

"We are consistently predicting a more neutral state, with no La Niña or El Niño later this year," said Steven Pawson, chief of the GMAO. "Our September forecast continues to show the neutral conditions that have been predicted since the spring."

As part of a research and development project, GMAO contributes experimental seasonal forecasts each month to the North American Multi-Model Ensemble (NMME) and other centers. The NMME produces a forecast by combining the individual forecasts of a number of participating institutions, which helps to reduce the uncertainty involved in forecasting events nine to twelve months in advance. The NMME prediction system delivers forecasts based on the National Oceanic and Atmospheric Administration (NOAA) operational schedule and is used by many operational forecasters in predicting El Niño and La Niña events.

For GMAO, the seasonal forecasts are one way to use NASA satellite data to improve near-term climate predictions of the Earth system.

"We’re really trying to bring as much NASA observational data as possible into these systems," Pawson said.

The scientists with GMAO feed a range of NASA satellite data and other information into the seasonal forecast model to predict whether an El Niño or La Niña event will occur in the coming nine months: information on the aerosols and ozone in the atmosphere, sea ice, winds, sea surface heights and temperatures, and more. The models are run on supercomputers at the NASA Center for Climate Simulation, processing 9 terabytes of data each month.

For much of this spring and summer, however, the Goddard group’s forecast of neutral conditions looked like an outlier. Most other forecasts originally called for a La Niña event, but then shifted to more neutral outlooks in August. But the GMAO forecasts produced in January 2016, which look nine months ahead, saw the Pacific Ocean reverting to normal temperatures after last year’s El Niño, and even getting a little colder than normal. Still, the water wouldn’t get cold enough to be considered a La Niña, according to the GMAO forecasts.

It’s not the first time in recent memory that GMAO was an outlier. "The big El Niño that peaked in November 2015, we actually began forecasting that back in March, and our forecast was in excellent agreement with the real event," said Robin Kovach, a research scientist at GMAO. While the strength of the 2015-2016 El Niño predicted by the model seemed at first to be excessive, it was borne out in subsequent observations.

The GMAO models aren’t always right, though, Kovach said. In 2014 the group forecast a large El Niño that didn’t materialize.

"There’s a fair degree of uncertainty when you start predicting for nine months ahead," Pawson said. But the group is constantly upgrading their systems, and is currently working to improve the resolution and bring in new types of satellite observations, such as soil moisture information from the Soil Moisture Active Passive mission, which launched in 2015.

GMAO scientists are also investigating how to incorporate observations of ocean color into the seasonal forecast model. Shades of green can tell researchers about how much phytoplankton is in a region, which in turn can provide information about fish populations.

"So if there’s another big El Niño in five years or so, we could be able to do online predictions of phytoplankton," he said, "and help fishermen predict where fish might be."

For more information, visit: https://gmao.gsfc.nasa.gov/

For NOAA’s El Niño and La Niña information and forecasts, visit: https://www.climate.gov/enso

Animation (mentioned), Text, Credits: NASA’s Goddard Space Flight Center/Kate Ramsayer/Karl Hille.

Greetings, Orbiter.ch

A Streamlined Form in Lethe Vallis, Mars

NASA - Mars Reconnaissance Orbiter (MRO) patch.

Sept. 13, 2016

This image shows a portion of Lethe Vallis, an outflow channel that also transported lava. The image was acquired at 15:16 local Mars time on May 6, 2016, by the High Resolution Imaging Science Experiment (HiRISE) camera on NASA's Mars Reconnaissance Orbiter. Another investigation of this area (Balme et al., 2011) discovered a repeat pattern of dune-like forms in the channel interpreted as fluvial dunes (or, giant current ripples) which are dunes formed by flowing water.

This is one of only a few places on Mars where such pristine-appearing landforms have been identified. The channel was formed by catastrophic floods, which produced the prominent crater-cored, teardrop-shaped island in the middle. The island's blunter end points upstream and its long tail points downstream.

Both the island and the fluvial dunes were formed by these extreme floods and their size is an indicator of the enormous discharges required to create them. The margins of the channel also show the terminal front of a pristine lava flow unit that inundated the channel from the south and the dunes show the remnants of another older lava flow. The top of the island displays polygonal patterned ground texture, which is a characteristic of periglacial processes in ice-rich ground.

The dark materials from the channel and island walls are probably dark sand being eroded from an underlying horizontal basaltic (lava) layer. The crater at the core of the island has elongated dunes and reticulate dust ridges inside. This single image thus contains features formed by periglacial, volcanic, fluvial, impact, aeolian and mass wasting processes, all in one place.

The University of Arizona, Tucson, operates HiRISE, which was built by Ball Aerospace & Technologies Corp., Boulder, Colo. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology in Pasadena, manages the Mars Reconnaissance Orbiter Project for NASA's Science Mission Directorate, Washington.

Additional image data: HiRISE, University of Arizona: http://www.uahirise.org/ESP_045833_1845

Mars Reconnaissance Orbiter (MRO): http://www.nasa.gov/mission_pages/MRO/main/index.html

Image, Text, Credits: NASA/JPL/University of Arizona/Caption: Henrik Hargitai and Ginny Gulick/Sarah Loff.

Greetings, Orbiter.ch

First physics experiment at HIE-ISOLDE begins












CERN - European Organization for Nuclear Research logo.

Sept. 13, 2016


Image above: Miniball is one of two detection stations receiving beams from HIE-ISOLDE. It’s a very efficient gamma detector array, and will be permanently linked to the beams from HIE-ISOLDE (Image: CERN).

This weekend the first physics experiment started running using radioactive beams from the newly upgraded HIE-ISOLDE facility. ISOLDE, the nuclear research facility at CERN, allows many different experiments to study the properties of atomic nuclei.

The upgrade means the machine can now reach an energy of 5.5 MeV per nucleon (MeV/u), making ISOLDE the only facility in the world able to study medium-mass to heavy nuclei at these energies.

The experiment is ready to go after the second of two cryomodules (containing the accelerating cavities) was installed, marking the end of the installation of phase one of HIE-ISOLDE.

The HIE-ISOLDE (High Intensity and Energy ISOLDE) project is a major upgrade of the ISOLDE facility that will increase the energy, intensity and quality of the beams delivered to scientists.

“It’s a major breakthrough. This is the result of eight years of development and manufacturing. This would not have been possible without the dedication of the technical staff at CERN. But what makes us most proud isn’t that we built a machine, but that we have attracted enthusiastic users to do forefront physics. We are looking forward to this exciting high intensity period,” says Yacine Kadi, leader of the HIE-ISOLDE project.


Image above: The tunnel at HIE-ISOLDE now contains two cryomodules – a unique setup that marks the end of phase one of the HIE-ISOLDE installation. By spring 2018 the project will have four cryomodules installed and will be able to reach energies of up to 10 MeV/u, opening up a broader range of nuclear physics (Image: Erwin Siesling/CERN).

This is the second physics run of the project (the first radioactive beam ran on 22 October 2015), but at that time the machine had only one cryomodule and was capable of running at an energy of just 4.3 MeV/u.

Now, with the second cryomodule coupled in, the machine is capable of reaching up to 5.5 MeV/u and can investigate the structure of heavier isotopes.

“It is a universal machine that can accelerate and investigate all nuclei from mass number 6 to mass 224 or more and at variable energies,” explains Maria Borge, leader of the ISOLDE group. “This year we’re investigating nuclei with mass number from 9 to 142 – these experiments can only be done at this moment at ISOLDE. At CERN.”

HIE-ISOLDE will be capable of investigating nuclei of all masses when the two additional cryomodules are installed in 2018, as the machine will then be able to accelerate them to energies of up to 10 MeV/u.

The further upgrades mean that, while ISOLDE can currently collect information about the collective properties of isotopes, eventually researchers will be able to use the machine at higher intensities to investigate the properties of individual particles. This can be done at the moment for lower masses, but has never been done before for heavier isotopes.

“The community has grown a lot recently, as people are attracted by the possibilities the new higher energies bring. It’s an energy domain that has not been explored much, since no other facility in the world can deliver pure beams at these energies,” Borge says.

HIE-ISOLDE will run from now until mid-November. All but one of the seven experiments planned for this period will use the Miniball detection station. The first experiment will investigate tin, an element notable for having two doubly magic isotopes.

HIE-ISOLDE: nuclear physics now at higher energies

Video above: Eight years since the start of the HIE-ISOLDE project, a new accelerator is in place, taking nuclear physics at CERN to higher energies. The first physics run last year marked the start of the project, but after a new cryomodule was installed physicists are able to reach energies of up to 5.5 MeV/u. With physicists setting their sights on even higher energies of 10 MeV/u in the future, more HIE-ISOLDE accelerating cavities and beamlines will be commissioned in the years to come. (Video: Christoph Madsen/CERN).

Note:

CERN, the European Organization for Nuclear Research, is one of the world’s largest and most respected centres for scientific research. Its business is fundamental physics, finding out what the Universe is made of and how it works. At CERN, the world’s largest and most complex scientific instruments are used to study the basic constituents of matter — the fundamental particles. By studying what happens when these particles collide, physicists learn about the laws of Nature.

The instruments used at CERN are particle accelerators and detectors. Accelerators boost beams of particles to high energies before they are made to collide with each other or with stationary targets. Detectors observe and record the results of these collisions.

Founded in 1954, the CERN Laboratory sits astride the Franco–Swiss border near Geneva. It was one of Europe’s first joint ventures and now has 22 Member States.

Related article:

Upgraded nuclear physics facility starts up
http://orbiterchspacenews.blogspot.ch/2015/11/upgraded-nuclear-physics-facility.html

Related links:

HIE-ISOLDE facility: http://home.cern/about/experiments/isolde

For more information about the European Organization for Nuclear Research (CERN), visit: http://home.cern/

Images (mentioned), Video (mentioned), Text, Credits: CERN/Harriet Jarlett.

Greetings, Orbiter.ch

Astronomers observe star reborn in a flash












ESA - Hubble Space Telescope logo.

13 September 2016

Stingray Nebula and SAO 244567

An international team of astronomers using Hubble have been able to study stellar evolution in real time. Over a period of 30 years dramatic increases in the temperature of the star SAO 244567 have been observed. Now the star is cooling again, having been reborn into an earlier phase of stellar evolution. This makes it the first reborn star to have been observed during both the heating and cooling stages of rebirth.

Even though the Universe is constantly changing, most processes are too slow to be observed within a human lifespan. But now an international team of astronomers have observed an exception to this rule. “SAO 244567 is one of the rare examples of a star that allows us to witness stellar evolution in real time”, explains Nicole Reindl from the University of Leicester, UK, lead author of the study. “Over only twenty years the star has doubled its temperature and it was possible to watch the star ionising its previously ejected envelope, which is now known as the Stingray Nebula.”

SAO 244567, 2700 light-years from Earth, is the central star of the Stingray Nebula and has been visibly evolving between observations made over the last 45 years. Between 1971 and 2002 the surface temperature of the star skyrocketed by almost 40 000 degrees Celsius. Now new observations made with the Cosmic Origins Spectrograph (COS) on the NASA/ESA Hubble Space Telescope have revealed that SAO 244567 has started to cool and expand.
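To put that figure in perspective, a back-of-envelope calculation (assuming, purely for illustration, a linear temperature rise over the 1971–2002 interval; the real evolution was not necessarily linear) gives an average heating rate of roughly 1,300 degrees Celsius per year:

```python
# Rough average heating rate of SAO 244567's surface between the
# 1971 and 2002 observations. The ~40,000 degree C rise is the figure
# quoted above; the linear-rate assumption is ours, for illustration only.
temp_rise_c = 40_000          # approximate surface-temperature increase, deg C
years = 2002 - 1971           # observation interval, years

rate_per_year = temp_rise_c / years
print(f"Average heating rate: ~{rate_per_year:.0f} deg C per year")
# prints: Average heating rate: ~1290 deg C per year
```

For comparison, low-mass stars like the Sun normally change their surface temperature over tens of thousands of years or longer during this phase, which is why such a rapid rise demanded an explanation.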

SAO 244567

This is unusual, though not unheard-of [1], and the rapid heating could easily be explained if one assumed that SAO 244567 had an initial mass of 3 to 4 times the mass of the Sun. However, the data show that SAO 244567 must have had an original mass similar to that of our Sun. Such low-mass stars usually evolve on much longer timescales, so the rapid heating has been a mystery for decades.

Back in 2014 Reindl and her team proposed a theory that resolved the issue of both SAO 244567’s rapid increase in temperature as well as the low mass of the star. They suggested that the heating was due to what is known as a helium-shell flash event: a brief ignition of helium outside the stellar core [2].

This theory has very clear implications for SAO 244567’s future: if it has indeed experienced such a flash, then this would force the central star to begin to expand and cool again — it would return back to the previous phase of its evolution. This is exactly what the new observations confirmed. As Reindl explains: “The release of nuclear energy by the flash forces the already very compact star to expand back to giant dimensions — the born-again scenario.”

Evolution of SAO 244567

It is not the only example of such a star, but it is the first time ever that a star has been observed during both the heating and cooling stages of such a transformation.

Yet no current stellar evolutionary models can fully explain SAO 244567’s behaviour. As Reindl elaborates: “We need refined calculations to explain some still mysterious details in the behaviour of SAO 244567. These could not only help us to better understand the star itself but could also provide a deeper insight in the evolution of central stars of planetary nebulae.”

Until astronomers develop more refined models for the life cycles of stars, aspects of SAO 244567’s evolution will remain a mystery.

Notes:

[1] The other star thought to have experienced the same type of helium flash event (see [2]) is FG Sagittae, located in the constellation Sagitta, making SAO 244567 the second of its kind. However, other objects undergoing similar “born-again” scenarios are known, including Sakurai’s Object, located in Sagittarius.

[2] Helium flash events, also known as late thermal pulses, occur late in the evolution of about 25% of low- to medium-mass stars. After evolving off the main sequence, these stars enter the red giant phase, where the star expands dramatically. Various changes occur in the star’s chemical and physical composition during this phase, until it has burnt most of the helium available in its core, which is by then composed of carbon and oxygen. Helium fusion continues in a thin shell around the core, but then turns off as the helium becomes depleted. This allows hydrogen fusion to start in a layer above the helium layer. After enough additional helium accumulates, helium fusion is reignited, leading to a thermal pulse which eventually causes the star to expand, cool and brighten temporarily.

More information:

The Hubble Space Telescope is a project of international cooperation between ESA and NASA.

The results will be presented in the paper “Breaking news from the HST: The central star of the Stingray Nebula is now returning towards the AGB”, published in the Monthly Notices of the Royal Astronomical Society (MNRAS).

The international team of astronomers in this study consists of Nicole Reindl (University of Leicester, UK; Eberhard Karls University, Germany), T. Rauch (Eberhard Karls University, Germany), M. M. Miller Bertolami (UNLP-CONICET, Argentina), H. Todt (University of Potsdam, Germany) and K. Werner (Eberhard Karls University, Germany).

Links:

Images of Hubble: http://www.spacetelescope.org/images/archive/category/spacecraft/

Link to science paper: http://www.spacetelescope.org/static/archives/releases/science_papers/heic1618/heic1618a.pdf

For more information about the Hubble Space Telescope, visit:

http://hubblesite.org/
http://www.nasa.gov/hubble
https://www.spacetelescope.org/

Images, Text, Credits: ESA/Hubble & NASA/Video: ESA/Hubble, L. Calçada.

Best regards, Orbiter.ch