Space News & Blog Articles

Tune into the SpaceZE News Network to stay updated on industry news from around the world.

The Streetlights in an Entire County Were Swapped to LEDs. Light Pollution got Worse

“The best-laid plans of mice and men often go awry” – this famous paraphrase of Scottish poet Robert Burns sometimes sums up the limits of human ingenuity.  That is exactly what happened when a county in Washington State decided to replace all of its county-owned streetlights with LEDs, at least partially in an effort to combat light pollution.  New research shows the change actually made the light pollution worse.

Dr. Li-Wei Hung and her colleagues at the National Park Service recently released a paper currently available on arXiv that details work that they did to monitor the night sky both before and after Chelan County replaced their streetlights with LEDs.

Map of Chelan County and where its street lights are located.
Credit – Hung et al.

Chelan County is located in the north-central part of the state and serves as a gateway to several outdoor recreational areas nearby, including North Cascades National Park.  Given this interest in the outdoors, less light pollution would seem like a benefit to stargazers hoping to catch a glimpse of the Milky Way.

So the county decided to replace all 3,693 of the county-owned streetlights (60% of the total outdoor streetlights in Chelan County) with “full cutoff” light-emitting diode (LED) bulbs.  About 80% of these new LEDs were “3000K” warm-white bulbs, while the other 20% were bluer “4000K” bulbs installed to meet lighting requirements set by the Washington State Department of Transportation.


Multiple Earth-Mass Rogue Planets Have Been Discovered Drifting Through the Milky Way

Last year we reported on how the Roman Space Telescope’s backers hoped it would be able to detect rogue planets using a technique called “microlensing”.  Now, a team led by Iain McDonald, then at the University of Manchester, beat them to the punch by finding a few examples of Earth-sized rogue planets using data from an already aging space telescope – Kepler.

Collecting and analyzing the data used in the study wasn’t easy, though.  Kepler embarked on a two-month campaign in 2016 that had it looking at millions of stars located near the center of the Milky Way every 30 minutes.  Even with that much data, picking the signal from the noise was difficult.

UT video explaining gravitational lensing, of which microlensing is a smaller-scale example.

These events are difficult to detect because microlensing shows up as tiny fluctuations in the light of a star when an object passes in front of it.  According to Dr. McDonald, about one in a million stars in the galaxy is undergoing microlensing at any point in time.  So of the millions of stars toward the center of the Milky Way, several could be undergoing microlensing right now.
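The one-in-a-million statistic turns into a concrete expectation with simple arithmetic. The sketch below assumes a monitored field of 5 million stars purely for illustration; the actual number of stars in Kepler's 2016 campaign field is not specified here.

```python
# Back-of-the-envelope: expected number of stars undergoing
# microlensing at any instant, given the ~1-in-a-million rate
# quoted by Dr. McDonald. The field size is an assumed figure
# for illustration, not a number from the study.

def expected_lensing_events(stars_monitored, rate=1e-6):
    """Expected number of simultaneous microlensing events."""
    return stars_monitored * rate

# Assume Kepler watched ~5 million stars toward the galactic center
events = expected_lensing_events(5_000_000)
print(events)  # 5.0 -- a handful of events at any given moment
```

Only about 1% of such events come from rogue planets, which is why a campaign this large still yields only a few candidates.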

Those events can last anywhere from minutes to days, depending on the relative motion of the foreground object and the background star, as well as the mass of the foreground object.  Of the many microlensing events that take place toward the galactic core, only approximately 1% are caused by rogue planets, and the signals from those events are much smaller than those from microlenses caused by foreground stars.

Video showing what a microlensing event looks like from Earth.
Credit – David Specht / Eamonn Kerins / University of Manchester

Despite all the difficulties of collecting data with an old telescope, sifting through all the additional data and background noise, and trying to differentiate between events caused by stars and those caused by planets, Dr. McDonald and his co-author, Eamonn Kerins, were able to find 27 candidate microlensing events. Of those, four could have potentially been caused by Earth-sized rogue planets.


Cassini Saw Methane in Enceladus’ Plumes. Scientists Don’t Know How it Could be There Without Life

Even though the Cassini mission at Saturn ended nearly four years ago, data from the spacecraft still keeps scientists busy. And the latest research using Cassini’s wealth of data might be the most enticing yet.

Researchers say they’ve detected methane in the plumes of Saturn’s icy moon Enceladus. How the methane is produced is not yet known, but the study suggests that the surprisingly large amount of methane found is likely coming from activity at hydrothermal vents on Enceladus’ interior seafloor. These vents could be very similar to those found in Earth’s oceans, where microorganisms live, feed on the energy from the vents, and produce methane in a process called methanogenesis.

“We are not concluding that life exists in Enceladus’ ocean,” said Régis Ferrière, an associate professor at the University of Arizona, and one of the study’s two lead authors.  “Rather, we wanted to understand how likely it would be that Enceladus’ hydrothermal vents could be habitable to Earthlike microorganisms. Very likely, the Cassini data tell us, according to our models.”

One of the biggest surprises of the 13-year Cassini mission came at Enceladus, a tiny moon with active geysers at its south pole. At only about 310 miles (500 km) in diameter, the bright, ice-covered Enceladus should be too small and too far from the Sun to be active. Instead, this little moon is one of the most geologically dynamic objects in the Solar System.

In 2005, Cassini discovered jets of water vapor and ice erupting from the surface of Enceladus. The water could be from a subsurface sea. Image Credit: Cassini Imaging Team, SSI, JPL, ESA, NASA

Stunning backlit images of the moon from Cassini’s camera show plumes erupting in Yellowstone-like geysers, emanating from tiger-stripe-shaped fractures in the moon’s surface. The discovery of the geysers took on more importance when Cassini later determined the plumes contained water ice and organics. Since life as we know it relies on water, this small but energetic moon has been added to the short list of possible places for life in our Solar System. 


Tales of a ‘Drunken Comet’- Astronomers Detect Alcohol Leaking From 46P/Wirtanen into Space

A close pass of Comet Wirtanen in 2018 offered researchers an unprecedented opportunity.

Comets are full of surprises. Not only do they often underperform (or, very occasionally, overperform) versus expectations, but they also offer a glimpse of the remnants of the very early solar system. In December 2018, astronomers had an unprecedented opportunity to study one of these relics up close as Comet 46P/Wirtanen sped by Earth at just 30 times the Earth-Moon distance (7.1 million miles away) on its closest passage of this century.

The orbit of Comet 46P/Wirtanen. NASA/JPL

Discovered by astronomer Carl A. Wirtanen in 1948, short-period Comet 46P/Wirtanen orbits the Sun every 5.4 years, on a path that takes it from a perihelion 1.06 AU from the Sun to an aphelion of 5.13 AU, just outside the perihelion of Jupiter.
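Those orbital figures are mutually consistent, which we can check with Kepler's third law (for a body orbiting the Sun, the period in years squared equals the semi-major axis in AU cubed). A minimal sketch:

```python
import math

# Consistency check on the orbital figures above using Kepler's
# third law: P^2 = a^3, with P in years and a in AU. The semi-major
# axis is the average of the perihelion and aphelion distances.
perihelion_au = 1.06
aphelion_au = 5.13

semi_major_axis = (perihelion_au + aphelion_au) / 2   # ~3.1 AU
period_years = math.sqrt(semi_major_axis ** 3)        # ~5.4 years

print(semi_major_axis, period_years)
```

The computed period lands right at the quoted 5.4 years.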

The comet’s 2018 approach past Earth was an especially favorable one, and this time, astronomers at the W.M. Keck Observatory on Maunakea, Hawai’i were ready. Keck’s Near Infrared Spectrograph (NIRSPEC) had just received a major upgrade, featuring more pixels and higher sensitivity, and the upgraded instrument would see first light obtaining spectra of the comet.

Instruments need hugs, too. Dr. Emily Martin with the newly upgraded NIRSPEC instrument. W.M. Keck Observatory.

And the results, recently published in The Planetary Science Journal, were a spectacular success. Not only did the team catalog a list of key compounds seen outgassing from Comet Wirtanen, but they discovered a high alcohol ratio for the comet, along with an anomalous heating mechanism at play.


Researchers Have Taught a Drone to Recognize and Hunt Down Meteorites Autonomously

Planetary scientists estimate that each year, about 500 meteorites survive the fiery trip through Earth’s atmosphere and fall to our planet’s surface. Most are quite small, and less than 2% of them are ever recovered. While the majority of rocks from space may not be recoverable due to ending up in oceans or remote, inaccessible areas, other meteorite falls are just not witnessed or known about.

But new technology has upped the number of known falls in recent years. Doppler radar has detected meteorite falls, as have all-sky camera networks specifically on the lookout for meteors. Additionally, the increased use of dashcams and security cameras has allowed for more serendipitous sightings and data on fireballs and potential meteorite falls.

A team of researchers is now taking advantage of additional technology advances by testing out drones and machine learning for automated searches for small meteorites.  The drones are programmed to fly a grid search pattern in a projected ‘strewn field’ for a recent meteorite fall, taking systematic pictures of the ground over a large survey area. Artificial intelligence is then used to search through the pictures to identify potential meteorites.  

“Those images can be analyzed using a machine learning classifier to identify meteorites in the field among many other features,” said Robert Citron of the University of California, Davis, in a recent paper published in Meteoritics & Planetary Science.

Citron and his colleagues have tested their conceptual drone setup several times, most recently in the area of a known 2019 meteorite fall near Walker Lake, Nevada. Their proof-of-concept meteorite classifier deploys a combination of “different convolution neural networks to recognize meteorites from images taken by drones in the field,” the team writes.
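The "grid search pattern" described above is commonly implemented as a back-and-forth (boustrophedon) sweep over the projected strewn field. The sketch below is purely illustrative; the field dimensions, track spacing, and waypoint format are assumptions, not details from the Citron et al. paper.

```python
# Hypothetical sketch of a "lawnmower" grid pattern a survey drone
# might fly over a strewn field: parallel passes, alternating
# direction, spaced so photos overlap. All numbers are made up
# for illustration.

def grid_waypoints(width_m, height_m, spacing_m):
    """Boustrophedon waypoints covering a rectangular survey area."""
    waypoints = []
    y = 0.0
    leftward = False
    while y <= height_m:
        row = [(0.0, y), (width_m, y)]
        if leftward:
            row.reverse()  # alternate direction on each pass
        waypoints.extend(row)
        leftward = not leftward
        y += spacing_m
    return waypoints

# A 100 m x 40 m field with 10 m between passes
path = grid_waypoints(100.0, 40.0, 10.0)
print(len(path))  # 10 waypoints: 5 passes, 2 endpoints each
```

Images taken along such a path would then be fed to the classifier to flag meteorite candidates.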


After Just 6 Weeks of Construction, Super Heavy is Built and Ready to Move

As usual, the SpaceX South Texas Launch Facility, located near the village of Boca Chica, is the focal point of a lot of attention. Almost two months ago, crews at the facility began working on the first true Super Heavy prototype, the launch stage of SpaceX’s Starship. After six weeks of assembly, SpaceX rolled the Super Heavy Booster 3 (B3) out of the “High Bay” (where it was assembled) and installed it onto the launch pad.

The assembly process began on May 15th, assisted by the new bridge crane (added to the High Bay back in March), and wrapped up on Thursday, July 1st. The B3 was then moved out, loaded aboard the company’s Self-Propelled Modular Transporter (SPMT), and transported down Highway 4 to the launch facility, where it was transferred by another crane onto Test Pad A.

Once it is ready to conduct commercial missions, the Starship and Super Heavy will be the world’s first entirely reusable launch system. As the booster element (aka. first stage) of the system, the Super Heavy stands about 65 meters (215 ft) tall and will be equipped with 32 Raptor engines. This record number of engines (more than any rocket in history) will allow the Super Heavy to produce 72 meganewtons (MN), or 16 million pounds of thrust (lbf).

This is more than twice the thrust generated by the first stage of the Saturn V booster, which NASA used to send the Apollo astronauts to the Moon – 35.1 MN, or 7.89 million lbf. When paired with the Starship – the orbital vehicle element that will rely on six Raptor engines – the launch system will be capable of sending 100 metric tons (110 US tons) to Low Earth Orbit (LEO).
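The thrust comparison above is a straightforward unit conversion, sketched here as a sanity check on the quoted figures:

```python
# Unit check on the thrust figures above: convert meganewtons to
# pounds-force and compare Super Heavy to the Saturn V first stage.
LBF_PER_N = 0.224809  # pounds-force per newton

super_heavy_mn = 72.0
saturn_v_mn = 35.1

super_heavy_lbf = super_heavy_mn * 1e6 * LBF_PER_N  # ~16.2 million lbf
saturn_v_lbf = saturn_v_mn * 1e6 * LBF_PER_N        # ~7.89 million lbf
ratio = super_heavy_mn / saturn_v_mn                # ~2.05

print(super_heavy_lbf, saturn_v_lbf, ratio)
```

The ratio comes out just over 2, matching the "more than twice the thrust" claim.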

According to a statement made by Musk via Twitter, the B3 prototype will be used for ground tests, similar to the ground tests conducted with the Starship (SN) prototypes. This differentiates it from Booster 1 (BN1), the first Super Heavy prototype to complete stacking inside the High Bay, which served as a


The "Crisis in Cosmology" Might not be a Crisis After all

The standard model of cosmology is known as the LCDM model. Here, CDM stands for Cold Dark Matter, which makes up most of the matter in the universe, and L stands for Lambda, which is the symbol used in general relativity to represent dark energy or cosmic expansion. While the observational evidence we have largely supports the LCDM model, there are some issues with it. One of the most bothersome is known as cosmic tension.

It centers on our measurement of the Hubble constant, which tells us the rate at which the universe has expanded over time. There are lots of ways to measure the Hubble constant, from the brightness of distant supernovae, to the clustering of galaxies, to fluctuations in the cosmic microwave background, to the light of microwave lasers (masers). All of these methods have advantages and disadvantages, but if our cosmological model is right they should all agree within the limits of uncertainty.

Measured Hubble values don’t agree. Credit: Wendy Freedman

The problem is, they don’t agree. Back in the early days of cosmology the uncertainty of our measurements was so large that all these results overlapped, but as our measurements got better it became clear that different methods gave slightly different values for the Hubble constant. In polite company, astronomers say there is tension between these values.

This tension means that either our measurements are a bit off, or there is something wrong with our model. This has led some astronomers to propose some missing aspects to our model, such as how the mass of neutrinos might realign our Hubble values. But as new measurements of the Hubble constant keep coming in, it looks as if the tension is just getting worse. Now a new paper from Wendy Freedman argues that the tension problem isn’t that bad and that the tension will likely fade as the next generation of telescopes gives us even better data.

As it stands, the main tension in Hubble values arises between methods that rely upon the cosmic distance ladder, such as supernova observations, and those that don’t, such as the cosmic microwave background (CMB).
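Astronomers quantify this tension as the gap between two measurements divided by their combined uncertainty. The sketch below uses values representative of recent distance-ladder and CMB results (roughly 73.2 ± 1.3 and 67.4 ± 0.5 km/s/Mpc); they are illustrative, not figures quoted from Freedman's paper.

```python
import math

# Quantifying "tension": the difference between two measurements
# in units of their combined (quadrature-summed) uncertainty.
# H0 values below are representative published figures, used here
# only for illustration.

def tension_sigma(h1, err1, h2, err2):
    """Discrepancy between two Hubble constant values, in sigma."""
    return abs(h1 - h2) / math.sqrt(err1**2 + err2**2)

# Distance-ladder (supernova) value vs. CMB-based value, km/s/Mpc
sigma = tension_sigma(73.2, 1.3, 67.4, 0.5)
print(sigma)  # ~4.2 sigma -- well beyond a chance fluctuation
```

A discrepancy above roughly 3 sigma is hard to dismiss as noise, which is why the gap is treated as a potential crisis.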


Satellites can Track Microplastics From Space

Sometimes simple and elegant solutions are all that is needed to solve a problem.  One problem searching for a solution was how to track microplastics.  These small particles of plastic are what remains after the sun and friction (such as ocean waves) break down larger plastic objects.  They have become a huge problem in the ocean, wreaking havoc on ecosystems and their constituent organisms.  Now, a team from the University of Michigan has used data originally collected to monitor hurricanes to track microplastics, potentially helping to rein in a problem that threatens to engulf the world’s oceans.

The data the team used was collected by NASA’s Cyclone Global Navigation Satellite System (CYGNSS), a constellation of 8 microsatellites that launched in 2016 and normally monitors weather and ocean patterns to keep track of hurricanes.  Specifically, the researchers were interested in data on ocean roughness – how choppy the ocean is.  Ocean roughness has many contributing factors, but one of them is the amount of debris in the water at a given location.

Heat map from the microplastics tracking paper that shows concentrations of microplastics / low ocean roughness.
Credit – University of Michigan Engineering YouTube Channel

Much of that debris is made up of microplastics.  So the researchers theorized that calmer water would result from high concentrations of microplastics.  To find the calming effect of those microplastics though, they first had to control for another factor impacting choppiness – wind speed.  Luckily, CYGNSS also has data on wind speeds at the same locations it collected data on ocean roughness.  

With proper controls in place, the researchers then compared areas of calm seas with the areas of concentrated microplastics, as predicted by models. They matched up particularly well, lending credence to the idea that microplastics could be tracked via remote sensing of ocean roughness.
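The "control for wind speed" step can be pictured as fitting a simple model of roughness versus wind speed and then looking for patches of ocean that are calmer than the wind alone predicts. The sketch below uses a plain least-squares line fit on made-up sample values; the actual CYGNSS analysis is far more sophisticated.

```python
# Conceptual sketch: regress roughness against wind speed, then
# treat negative residuals (smoother than the wind predicts) as
# candidate microplastic signals. All sample values are invented
# for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

wind = [3.0, 5.0, 7.0, 9.0]        # wind speed samples, m/s
rough = [0.30, 0.50, 0.55, 0.90]   # roughness, arbitrary units

slope, intercept = fit_line(wind, rough)
anomalies = [r - (slope * w + intercept) for w, r in zip(wind, rough)]

# The most negative anomaly marks the calmest-for-its-wind patch
calmest = min(range(len(anomalies)), key=lambda i: anomalies[i])
print(calmest, anomalies[calmest])
```

Patches flagged this way can then be compared against model-predicted microplastic concentrations, as the researchers did.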

Example of microplastics captured in the Atlantic Ocean.
Credit – Nichole Trenholm / Ocean Research Project


One of the Brightest Star-Forming Regions in the Milky Way, Seen in Infrared

Certain parts of the galaxy are more magical than others.  There are barren wastelands where barely a particle strays through occasionally, and there are fantastical nebulae that can literally light up the sky.  But beyond their good looks, those nebulae hold secrets to understanding some of the most important features of any galaxy – stars. Now, for the first time, a team from the University of Maryland managed to capture a high resolution image of one of the most active star-forming regions in our part of the galaxy.  Data from that image are not only spectacular, but can illuminate the details of the star formation process.

The instrument the team used to collect the most important parts of the data, known as SOFIA, is an amazing airborne telescope.  Mounted in a modified Boeing 747, it specializes in capturing infrared images, just beyond the wavelengths that the human eye can see.  SOFIA turned its eye on a star-forming region known as Westerlund 2, which is located in the RCW 49 nebula.  But the researchers didn’t stop there, using data collected in every wavelength from X-rays down to radio waves via different instruments.

Hubble image of a star-forming nebula with the 747 housing SOFIA in the foreground.
Credit – Marc Pound / UMD

Luckily there was plenty of data to choose from.  That region of space had been the focus of previous studies, which hinted that there might be two bubbles of warm gas surrounding the region.  These types of gas bubbles have long been thought to play a role in star formation. Data from SOFIA definitively showed that there was in fact only one bubble, and that bubble is expanding.  The most likely cause of that expansion is a stellar wind launched by the formation of a massive star somewhere within the bubble itself.  

Gas bubbles aren’t the only material surrounding these formation regions though.  They’re joined by a “shell” made up of a form of ionized carbon.  Seen in the kaleidoscope of wavelengths the researchers analyzed, the shell and the gas bubbles intermingle with each other, but separating out individual wavelengths allowed for much higher resolution pictures of the bubbles (which were invisible in previous radio and sub-millimeter data) and shell (which glowed in a far-infrared band that SOFIA was able to collect).  

Hubble view of the huge star formation region N11 in the Large Magellanic Cloud.

The Square Kilometer Array has Gotten the Official Green Light to Begin Construction

In Australia and South Africa, a series of radio telescopes will soon be joined by a number of newly-constructed facilities to form the Square Kilometer Array (SKA). Once established, the SKA will have a collecting area that measures a million square meters (about 1.2 million square yards). It will also be 50 times more sensitive than any radio telescope currently in operation, and be able to conduct surveys ten thousand times faster.

During a historic meeting that took place on June 29th, 2021, the member states that make up the SKAO Council voted to commence construction. By the late 2020s, when it’s expected to gather its first light, the array will consist of thousands of dishes and up to a million low-frequency antennas. These will enable it to conduct all kinds of scientific operations, from scanning the earliest periods in the Universe to searching for extraterrestrial intelligence (SETI).

At its core, the SKA relies on a process known as interferometry, where light from cosmic sources is gathered by multiple telescopes and then combined to create high-resolution images. For radio telescopes, this technique has the added advantage of allowing for observations where only a subset of the full array is available. With such a large collecting area, the SKA will allow for all kinds of revolutionary science.
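The payoff of spreading antennas across two continents follows from the diffraction limit: an interferometer's angular resolution scales as the observing wavelength divided by the longest baseline (theta ~ lambda / B). The numbers below are illustrative, not official SKA specifications.

```python
import math

# Diffraction-limited angular resolution of an interferometer:
# theta ~ wavelength / baseline (in radians). Values here are
# assumed for illustration only.

def resolution_arcsec(wavelength_m, baseline_m):
    """Approximate angular resolution in arcseconds."""
    radians = wavelength_m / baseline_m
    return math.degrees(radians) * 3600

# A 21 cm (neutral hydrogen line) observation on a 150 km baseline
print(resolution_arcsec(0.21, 150_000))  # sub-arcsecond detail
```

Longer baselines sharpen the view further, which is why array layout matters as much as total collecting area.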

A Huge Effort

The SKA consists of four “precursor facilities,” which include the MeerKAT and the Hydrogen Epoch of Reionization Array (HERA) in South Africa, and the Australian SKA Pathfinder (ASKAP) and Murchison Widefield Array (MWA) in Australia. Beyond these, there are also the “pathfinder” facilities located outside of these two countries, consisting of the Allen Telescope Array in northern California and the Low-Frequency Array (LOFAR) in the Netherlands.

These facilities are divided into two networks designated SKA-Low and SKA-Mid, which describe the radio frequency range they will cover. The decision to approve construction comes on the heels of two major developmental milestones for the SKAO. First, there was the publication of two key documents last year, the Observatory’s Construction Proposal and Observatory Establishment and Delivery Plan, and an executive summary of both.


Potentially More Subsurface Lakes Found on Mars

One of the hardest things in science is reconciling new data that complicates or refutes previous findings.  It’s even more difficult when those findings were widely publicized and heralded around the community.  But that is how science works – the theories must fit the data.  So when a team from JPL analyzed Mars Express data on the Martian south pole, they realized the findings announced in 2018 about subsurface lakes on Mars might have been more fraught than originally thought.

That original discovery was announced after scientists found particularly bright spots in radar signals under the surface, which were interpreted as reflections from liquid water.  The bright spots lie in a region called the “South Polar Layered Deposits,” where layers of water ice, dry ice, and dust have been intermixed over millions of years as Mars’ axial tilt changed. In the lower layers, temperatures were thought to be high enough that sufficiently salty water could potentially be liquid.

UT video discussing the possibility of life (and water) on Mars.

When looking over data from the entirety of the Martian south pole, the JPL scientists noticed the same highly reflective signatures in dozens of additional places under the surface.  Some appeared to be within 1.6 km (1 mile) of the surface.  Unfortunately, that also means the temperature there would be a chilly -63 °C (-81 °F). Even with a massive amount of perchlorates (a kind of salt prevalent on Mars), water would still be frozen at those temperatures.

First, the investigators, Jeffrey Plaut and Aditya Khuller from JPL (Khuller is now at ASU), tried to think of other potential heat sources that could raise the temperature in the areas with the highly reflective features.  An obvious candidate would be volcanism, which is potentially responsible for subsurface oceans on other worlds in the solar system.  However, there is no other evidence of active volcanism at the Martian south pole, so the researchers ruled it out as a heat source.

Visualization from the original 2018 study showing the reflected radar signals that were interpreted as lakes.
Credit –  Context map: NASA/Viking; THEMIS background: NASA/JPL-Caltech/Arizona State University; MARSIS data: ESA/NASA/JPL/ASI/Univ. Rome; R. Orosei et al 2018

Dots on this map of the Martian south pole show where radar reflections were noted by the MARSIS instrument the JPL scientists used.

NASA is Testing out new Composite Materials for Building Lightweight Solar Sail Supports

Space exploration is driven by technology – sometimes literally in the case of propulsion technologies.  Solar sails are one of those propulsion technologies that has been getting a lot of attention lately.  They have some obvious advantages, such as not requiring fuel, and their ability to last almost indefinitely.  But they have some disadvantages too, not the least of which is how difficult they are to deploy in space.  Now, a team from NASA’s Langley Research Center has developed a novel type of composite boom that they believe can help solve that weakness of solar sails, and they have a technology demonstration mission coming up next year to prove it.

The mission, known as the “Advanced Composite Solar Sail System” (ACS3) mission, is designed around a 12U CubeSat, which measures in at a tiny 23 cm x 23 cm x 34 cm (9 in x 9 in x 13 in). The solar sail it hopes to deploy will come in at almost 200 square meters (about 2,150 sq ft), and both it and its composite booms will fit inside the CubeSat enclosure, which is not much larger than a toaster oven.

The booms themselves are made out of a novel composite that is 75% lighter than previous deployable booms, while also suffering from only 1% of the thermal distortion that previous metallic booms were subjected to.  They also conveniently roll into an 18 cm (7 in) diameter spool that can be easily stored and easily deployed once the CubeSat is in space.

Its deployment mechanism still requires power, however, so the ACS3 mission will use a small solar panel to collect enough power to enable that deployment. But once it is fully unfurled, the mission will switch to a technology demonstration of actually adjusting the CubeSat’s orbit using only solar radiation pressure – the driving force of solar sails.
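The "driving force" mentioned above can be estimated from first principles: at 1 AU, a perfectly reflective sail feels roughly twice the solar flux divided by the speed of light as pressure. The sail area and spacecraft mass below are assumed round numbers for illustration, not ACS3 specifications.

```python
# Rough estimate of the acceleration sunlight gives a solar sail.
# At 1 AU a perfect reflector feels pressure of ~2 * flux / c.
# Area, mass, and reflectivity are assumed figures, not ACS3 specs.

SOLAR_FLUX = 1361.0       # W/m^2 at 1 AU
C = 299_792_458.0         # speed of light, m/s

def sail_acceleration(area_m2, mass_kg, reflectivity=1.0):
    """Characteristic acceleration of a flat sail facing the Sun."""
    pressure = (1 + reflectivity) * SOLAR_FLUX / C  # N/m^2
    return pressure * area_m2 / mass_kg             # m/s^2

# e.g. an 80 m^2 sail pushing a 16 kg CubeSat (assumed values)
accel = sail_acceleration(80.0, 16.0)
print(accel)  # tens of micrometers per second squared
```

The push is tiny, but because it is continuous and fuel-free it can meaningfully change an orbit over weeks and months, which is exactly what the demonstration will attempt.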

Video from Langley explaining the development of the composite booms and how they will be used on the ACS3 mission.
Credit – NASA Langley Research Center YouTube Channel

Solar sails are only as effective as their size allows – a larger sail means more radiation pressure and faster acceleration.  Therefore, the team behind the composite booms is also developing a larger boom system that would allow them to deploy solar sails coming in at a whopping half an acre (2,000 sq meters).  The spools would need to be slightly longer, but the potential payoff is huge.


Astronomers see an Accretion Disk Where Planets are About to Form

Planet formation is notoriously difficult to study.  Not only does the process take millions of years, making it impossible to observe in real time, but myriad factors play into it, making it difficult to distinguish cause and effect.  What we do know is that planets form in features known as protoplanetary disks, which are made up of gas and dust surrounding young stars.  And now a team using ALMA has found a star system with a protoplanetary disk and enough variability to help nail down some details of how exactly planet formation works.

The research is described in two new papers in The Astrophysical Journal.  They describe the star system Elias 2-27, which is located about 400 light years from Earth in Ophiuchus, the Serpent Bearer.  It has attracted the attention of astronomers for the last 5 years, first being studied in 2016 when it revealed a pinwheel of dust surrounding the star.

Visualization from NASA of planets forming in a protoplanetary disk.
Credit – NASA

Usually protoplanetary disks don’t take the shape of a pinwheel, which is more commonly found in galactic formations such as the Pinwheel Galaxy.  Researchers speculated that the two pinwheel arms visible around the star were caused by gravitational instabilities, which could also contribute to planetary formation processes.  But they needed further data to prove their idea.

That is where the new papers come in.  Data that was collected over the last 5 years proved the existence of gravitational instabilities, but also found a few things that weren’t caught in the first round of data.  It appears there may have been more material accreting to the disk itself, causing more gravitational chaos. More surprisingly, some parts of the protoplanetary disk were much taller than others.

Traces of dynamic gas patterns in the Elias 2-27 system.
Credit – ALMA (ESO / NAOJ / NRAO) / T. Paneque-Carreño (Universidad de Chile), B. Saxton (NRAO)

This type of “vertical asymmetry” had never been observed before in a protoplanetary disk, and allowed the researchers to take a step forward in one of the computational hurdles that block the path to fully understanding planetary formation.  Computational members of the team had predicted that gravitational instabilities might cause the huge pillars of matter that appear to tower over the disk.  Those towers also open up the possibility of calculating the actual quantity of material present in the disk itself – a measurement that has eluded planetary scientists so far.  


Hawking Made a Prediction About Black Holes, and Physicists Just Confirmed it

On its own, a black hole is remarkably easy to describe. The only observable properties a black hole has are its mass, its electric charge (usually zero), and its rotation, or spin. It doesn’t matter how a black hole forms. In the end, all black holes have the same general structure. Which is odd when you think about it. Throw enough iron and rock together and you get a planet. Throw together hydrogen and helium, and you can make a star. But you could throw together grass cuttings, bubble gum, and old Harry Potter books, and you would get the same kind of black hole that you’d get if you just used pure hydrogen.

This strange behavior of black holes is known as the no hair theorem, and it relates to what’s known as the information paradox. In short, since everything in the universe can be described by a certain amount of information, and objects can’t just disappear, the total amount of information in the universe should be constant. But if you toss a chair into a black hole, it just adds to the black hole’s mass and spin. All the information about the color of the chair, whether it’s made of wood or steel, and whether it’s tall or short is lost. So where did that information go?

A black hole seems to strip information from objects. Credit: Gravitation @ Aveiro University

One solution to this information paradox could be possible thanks to Stephen Hawking. Back in 1974, he demonstrated that the event horizon of a black hole might not be absolute. Because of quantum indeterminacy, black holes should emit a tiny amount of light now known as Hawking radiation. Hawking radiation has never been observed, but if it exists the information lost when objects enter a black hole might be carried out of the black hole via this light. Thus the information isn’t truly lost.

If Hawking radiation is real, that also means that black holes follow the laws of thermodynamics. It’s an idea first proposed by Jacob Bekenstein. If black holes emit light, then they have to have a thermal temperature. Starting from Bekenstein’s idea, several physicists have shown that there is a set of laws for black holes known as black hole thermodynamics.
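The temperature Bekenstein and Hawking's work assigns to a black hole follows a simple formula, T = ħc³ / (8πGMk_B), inversely proportional to the black hole's mass. A quick sketch of that formula using standard physical constants:

```python
import math

# Hawking temperature of a black hole:
#   T = hbar * c^3 / (8 * pi * G * M * k_B)
# Constants are standard CODATA-style values.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 299_792_458.0        # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.380649e-23       # Boltzmann constant, J/K
SOLAR_MASS = 1.989e30    # kg

def hawking_temperature(mass_kg):
    """Hawking temperature of a black hole, in kelvin."""
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * K_B)

# A stellar-mass black hole is far colder than the CMB itself
print(hawking_temperature(SOLAR_MASS))  # on the order of 1e-7 K
```

Because the temperature falls as mass grows, real astrophysical black holes radiate so faintly that Hawking radiation has never been directly observed, just as the article notes.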

Since you’re reading this article, you’re probably familiar with the second law of thermodynamics, which states that the entropy of any system must increase. This is the reason that a cup of hot coffee cools down over time, slightly heating the room until the coffee and the room are all the same temperature. You never see a cold cup of coffee spontaneously heat up while slightly cooling the room. Another way to state the second law is that heat flows from a hot object to surrounding cooler objects.


Unfortunately, There are Other Viable Explanations for the Subsurface Lakes on Mars

Ever since 1971, when the Mariner 9 probe surveyed the surface of Mars, scientists have theorized that there might be subsurface ice beneath the southern polar ice cap on Mars. In 2004, the ESA’s Mars Express orbiter further confirmed this theory when its Mars Advanced Radar for Subsurface and Ionosphere Sounding (MARSIS) instrument detected what looked like water ice at a depth of 3.7 km (2.3 mi) beneath the surface.

These findings were very encouraging since they indicated that there could still be sources of liquid water on Mars where life could survive. Unfortunately, after reviewing the MARSIS data, a team led by researchers at Arizona State University (ASU) has proposed an alternative explanation. As they indicated in a recent study, the radar reflections could be the result of clays, metal-bearing minerals, or saline ice beneath the surface.

The study, which recently appeared in Geophysical Research Letters, was led by Carver J. Bierson – a postdoctoral researcher at ASU’s School of Earth and Space Exploration (SESE). He was joined by Earth and Planetary Sciences Professor Slawek Tulaczyk of UC Santa Cruz (UCSC), ASU research associate Samuel Courville, and Nathaniel Putzig – a senior scientist with the Planetary Science Institute (PSI).

A view of the southern polar plain of Mars with the area scanned by MARSIS highlighted. Credit: USGS Astrogeology Science Center/ASU/INAF

The MARSIS instrument works by directing a ground-penetrating radar beam towards the surface of Mars, then measuring the reflected echo. An underground zone of liquid water will have very different electrical properties from surrounding ice or rocks and will reflect very strongly. This technique allowed the Mars Express to create a subsurface map of Mars up to depths of 5 km (3 mi).

Back in 2018, an analysis of the subsurface radar reflections by a team of Italian researchers focused primarily on electrical permittivity, which controls the speed of radio waves within a material. The higher a material’s permittivity (liquid water’s is far higher than that of ice or rock), the slower the waves travel, and the power of the reflected waves is affected as well. This unusually bright radar reflection was interpreted as a large patch of liquid, briny water.
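The permittivity effect is easy to illustrate numerically: a radio wave travels at v = c/√ε_r in a material with relative permittivity ε_r. The values below are generic textbook figures, not those used in the study:

```python
# Why permittivity matters for ground-penetrating radar like MARSIS:
# radio waves slow down in high-permittivity materials, v = c / sqrt(eps_r).
# Relative permittivities here are rough textbook values for illustration.
C = 2.998e8  # speed of light in vacuum, m/s

materials = {
    "water ice":     3.15,
    "basaltic rock": 8.0,
    "liquid water":  80.0,
}

for name, eps_r in sorted(materials.items(), key=lambda kv: kv[1]):
    v = C / eps_r ** 0.5
    print(f"{name:13s} eps_r = {eps_r:5.1f}  v = {v / C:.2f} c")
```

The large permittivity contrast between liquid water and its surroundings is what makes a buried brine layer reflect so strongly; the trouble, as the ASU team points out, is that other materials can produce similar contrasts.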



Continue reading
  272 Hits

Bad News, Life Probably can’t Exist on Venus. Good News, it Could be in Jupiter’s Clouds

For decades, scientists engaged in the search for life in the Universe (aka. astrobiology) have focused on searching for life on other Earth-like planets. These include terrestrial (aka. rocky) planets beyond our Solar System (extrasolar planets) and ones here at home. Beyond Earth, Mars is considered the most habitable planet after Earth, and scientists have also theorized that life could exist (in microbial form) in the cloud tops of Venus.

In all cases, a major focal point is whether or not planets have large bodies of water on their surfaces (or did in the past). However, a new study led by a research team from the UK and Germany (with support from NASA) has shown that the existence of life may have less to do with the quantity of water and more to do with the presence of atmospheric water molecules. As a result, we may have better luck finding life on Jupiter’s turbulent cloud deck than Venus’.

The study that describes their findings, which was recently published in Nature Astronomy under the title “Water activity in Venus’s uninhabitable clouds and other planetary atmospheres,” was led by Dr. John E. Hallsworth of the School of Biological Sciences at Queen’s University Belfast. He was joined by colleagues from multiple universities in the UK and Germany, and the NASA Ames Research Center’s Space Science Division (SSD).

This artistic impression depicts Venus. Astronomers at MIT, Cardiff University, and elsewhere may have observed signs of life in the atmosphere of Venus. Credits: ESO (European Southern Observatory)/M. Kornmesser & NASA/JPL-Caltech

Venus has been the focal point of a lot of interest lately, ever since the announcement that phosphine gas had been detected in the planet’s dense atmosphere. These findings, according to a team of independent researchers, were a possible sign that microbial life might exist in Venus’ sulfuric acid clouds (aka. a potential biosignature). However, according to this latest study, Venus’ atmosphere doesn’t have enough water activity to support this claim.

This conclusion is based on a new method devised by Hallsworth and his colleagues to determine the level of water activity in a planet’s atmosphere. They then applied this method to Venus’ atmosphere, where temperatures range from 30 to 80 °C (86 to 176 °F) at altitudes of 50 km (30 mi) above the surface and water vapor accounts for about 0.002% of the atmosphere by volume.
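To get a sense of scale, water activity can be roughly estimated as the ratio of water vapor’s partial pressure to the saturation vapor pressure at that temperature. The sketch below is an order-of-magnitude illustration only, not the authors’ method, and assumes a total pressure of about 1 bar at 50 km plus the Tetens approximation for saturation pressure:

```python
import math

def saturation_pressure(t_celsius: float) -> float:
    """Tetens approximation for saturation vapor pressure over
    liquid water, in Pa (valid near Earth-ambient temperatures)."""
    return 610.78 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))

total_pressure = 1.0e5        # Pa; assumed ~1 bar at ~50 km altitude
h2o_fraction   = 0.002 / 100  # 0.002% water vapor by volume (quoted above)
p_h2o = h2o_fraction * total_pressure

for t in (30, 80):  # temperature range quoted above, deg C
    a_w = p_h2o / saturation_pressure(t)
    print(f"T = {t} C: water activity ~ {a_w:.1e}")
```

Even this crude estimate comes out orders of magnitude below the lowest water activity at which any known organism remains active (roughly 0.585), in line with the study’s conclusion about Venus’ clouds.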



Continue reading
  233 Hits

The Center of the Milky Way is the Most Likely Place to Find a Galactic Civilization

Aim for the Center

The Milky Way is 13 BILLION years old. Some of our Galaxy’s oldest stars were born near the beginning of the Universe itself. During all these eons of time, we know at least one technological civilization has been born – US!

But if the Galaxy is so ancient, and we know it can create life, why haven’t we heard from anybody else? If another civilization was just 0.1% of the Galaxy’s age older than we are, they would be millions of years further along than us and presumably more advanced. If we are already on the cusp of sending life to other worlds, shouldn’t the Milky Way be teeming with alien ships and colonies by now?

Maybe. But it’s also possible that we’ve been looking in the wrong place. Recent computer simulations by Jason T. Wright et al. suggest that the best place to look for ancient space-faring civilizations might be the core of the Galaxy, a relatively unexplored target in the search for extraterrestrial intelligence.

Animation showing the settlement of the galaxy. White points are unsettled stars, magenta spheres are settled stars, and white cubes represent a settlement ship in transit. The spiral structure formed is due to galactic shear as the settlement wave expands. Once the Galaxy’s center is reached, the rate of colonization increases dramatically. Credit: Wright et al

The Churn

Older mathematical models of space colonization have tried to determine the time required for a civilization to spread throughout the Milky Way. Given the size of the Milky Way, wide-scale galactic colonization could take longer than the age of the Galaxy itself. However, a unique feature of this new simulation is that it accounts for the motion of the Galaxy’s stars. The Milky Way is not static, as prior models assumed; rather, it is a churning, swirling mass. Colonization vessels or probes would be flying among stars that are themselves in motion. The new simulation reveals that stellar motion actually aids colonization, contributing a diffusing effect to the spread of a civilization.

The simulation is based on previous research by Jonathan Carroll-Nellenback et al., which proposed that a hypothetical civilization could spread at sub-light speeds through a moving Galaxy. The simulation assumes a civilization using ships travelling at velocities comparable to our own spacecraft (about 30 km/s). When a ship arrives at a virtual habitable world in the simulation, the world is considered a colony and can itself launch another craft every 100,000 years, provided another uninhabited world is in range. Simulated spacecraft have a range of 10 light-years and a maximum travel duration of 300,000 years. Technology from a virtual colony was set to last 100 million years before dying out, with the opportunity to be resettled should another colony drift into range through galactic motion.
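A toy version of such a settlement wave can be sketched in a few lines. This is a minimal illustration only, assuming static stars on a regular grid (the real simulation’s key ingredient, stellar motion, is deliberately left out), with the 10 light-year ship range quoted above:

```python
import math

# Toy settlement-wave sketch: static stars on a grid, NOT the Wright
# et al. simulation. A settled star can settle any unsettled star
# within SHIP_RANGE light-years; the wave expands in generations.
SHIP_RANGE = 10.0  # ly, per the simulation parameters described above

def make_stars(n_side=10, spacing=5.0):
    """A deterministic grid of stars spaced `spacing` light-years apart."""
    return [(x * spacing, y * spacing)
            for x in range(n_side) for y in range(n_side)]

def settle(stars, origin=0, ship_range=SHIP_RANGE):
    """Breadth-first settlement wave from `origin`; returns each
    star's settlement generation (hops from the origin)."""
    generation = {origin: 0}
    frontier = [origin]
    while frontier:
        next_frontier = []
        for i in frontier:
            xi, yi = stars[i]
            for j, (xj, yj) in enumerate(stars):
                if j in generation:
                    continue
                if math.hypot(xi - xj, yi - yj) <= ship_range:
                    generation[j] = generation[i] + 1
                    next_frontier.append(j)
        frontier = next_frontier
    return generation

stars = make_stars()
gen = settle(stars)
print(f"settled {len(gen)} of {len(stars)} stars in {max(gen.values())} waves")
```

In the real model the stars drift, so even pockets that this static sketch could never reach eventually come within range of a settled system, which is exactly why stellar motion speeds up colonization.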



Matthew Cimone
Continue reading
  270 Hits

NASA Continues to Try and Rescue Failing Hubble

Things are not looking very good for the Hubble Space Telescope right now. On Sunday, June 13th, the telescope’s payload computer suddenly stopped working, prompting the main computer to put the telescope into safe mode. While the telescope itself and its science instruments remain in working order, science operations have been suspended until the operations team can figure out how to get the payload computer back online.

While attempting to restart the computer, the operations team has also tried to trace the issue to specific components in the payload computer and switch to their backup modules. As of June 30th, the team began looking into the Command Unit/Science Data Formatter (CU/SDF) and the Power Control Unit (PCU). Meanwhile, NASA is busy preparing and testing procedures to switch to backup hardware if either of these components is the culprit.

The payload computer is part of the Science Instrument Command and Data Handling (SI C&DH) unit, where it is responsible for controlling and coordinating the scientific instruments aboard the spacecraft. The current issues began when the main computer stopped receiving the “keep-alive” signal from the payload computer – which lets the main computer know that everything is working.
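This “keep-alive” arrangement is a classic watchdog pattern and can be sketched generically. The class and method names below are hypothetical illustrations, not Hubble’s actual flight software:

```python
import time

# Minimal watchdog sketch (hypothetical names, NOT Hubble's flight
# software): the supervisor tracks the time of the last keep-alive
# signal and enters safe mode when the signal goes stale.
TIMEOUT = 3.0  # seconds without a keep-alive before declaring failure

class MainComputer:
    def __init__(self):
        self.last_heartbeat = time.monotonic()
        self.safe_mode = False

    def receive_keep_alive(self):
        """Called whenever the supervised computer's signal arrives."""
        self.last_heartbeat = time.monotonic()

    def check(self, now=None):
        """Periodic supervision: trip safe mode if the signal is stale."""
        now = time.monotonic() if now is None else now
        if now - self.last_heartbeat > TIMEOUT:
            self.safe_mode = True  # e.g. suspend science operations
        return self.safe_mode

main = MainComputer()
main.receive_keep_alive()
print(main.check())                                # False: signal fresh
print(main.check(now=main.last_heartbeat + 10.0))  # True: signal stale
```

The design choice worth noting is that the supervisor never needs to know *why* the heartbeat stopped; silence alone is treated as failure, which is what lets the main computer react even when the payload computer cannot report an error.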

The Hubble Space Telescope being released from the cargo bay of the Space Shuttle Discovery in 1990. Credit: NASA

That’s when the operations team began investigating different pieces of hardware on the SI C&DH as the possible source. Based on the available data, the team initially thought that the problem was due to a degrading memory module and tried to switch to one of the module’s multiple backups, but without success. On the evening of Thursday, June 17th, another attempt was made to bring both modules back online, but it also failed.

At that point, they began looking into other possible sources of the shutdown, like the Standard Interface (STINT) hardware, which bridges communications between the computer’s Central Processing Module (CPM) and other components; the team began investigating the CPM as well. Now, the team is investigating the Command Unit/Science Data Formatter (CU/SDF) and a power regulator within the Power Control Unit (PCU).


Continue reading
  265 Hits

A Small Satellite With a Solar Sail Could Catch up With an Interstellar Object

When Oumuamua, the first interstellar object ever observed passing through the Solar System, was discovered in 2017, it exhibited some unexpected properties that left astronomers scratching their heads. Its elongated shape, lack of a coma, and the fact that it changed its trajectory were all surprising, leading to several competing theories about its origin: was it a hydrogen iceberg exhibiting outgassing, or maybe an extraterrestrial solar sail (sorry folks, not likely) on a deep-space journey? We may never know the answer, because Oumuamua was moving too fast, and was observed too late, to get a good look.

It may be too late for Oumuamua, but we could be ready for the next strange interstellar visitor if we wanted to. A spacecraft could be designed and built to catch such an object at a moment’s notice. The idea of an interstellar interceptor like this has been floated by various experts, and funding to study such a concept has even been granted through NASA’s Innovative Advanced Concepts (NIAC) program. But how exactly would such an interceptor work?

A new paper released on arXiv on June 27th explores one possible mission design. Derived from the NIAC study, the proposal suggests combining solar sail technology with miniaturized, lightweight space probes.

Missions like JAXA’s IKAROS probe to Venus and the Planetary Society’s ongoing LightSail 2 project in Earth orbit have shown that solar sails, which use photons from the sun to accelerate, are entirely feasible propulsion systems. Similarly, the successful use of CubeSats on interplanetary missions was demonstrated by the Jet Propulsion Laboratory in 2018. They sent two CubeSats, named Mars Cube One (MarCO-A and MarCO-B), to accompany the InSight lander on its journey to the red planet. The CubeSats worked like a charm.
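The thrust sunlight provides is easy to estimate: an ideal, perfectly reflecting sail of area A feels a force of 2PA/c, where P is the solar flux. A quick sketch using approximate IKAROS-class figures (the sail area and mass below are rough values assumed for illustration):

```python
# Back-of-the-envelope solar-sail acceleration at 1 AU, assuming a
# perfectly reflective sail. Sail area and mass are approximate
# IKAROS-class figures, used here purely for illustration.
SOLAR_FLUX = 1361.0  # W/m^2 at 1 AU
C = 2.998e8          # speed of light, m/s

def sail_acceleration(area_m2: float, mass_kg: float) -> float:
    """Ideal acceleration: reflected photons impart a force of 2*P*A/c."""
    force = 2 * SOLAR_FLUX * area_m2 / C
    return force / mass_kg

# ~200 m^2 of sail pushing ~300 kg yields only micrometers per second
# squared -- tiny, but it never runs out of propellant.
a = sail_acceleration(196.0, 315.0)
print(f"{a:.1e} m/s^2")
```

The acceleration scales with area over mass, which is why pairing a large sail with a CubeSat-sized payload, rather than a conventional probe, makes an interstellar-object interceptor plausible.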

When combined, solar sails and CubeSats could be a powerful tool for exploration.

Continue reading
  280 Hits

To Take the Best Direct Images of Exoplanets With Space Telescopes, we’re Going to Want Starshades

Between 2021 and 2024, the James Webb (JWST) and Nancy Grace Roman (RST) space telescopes will be launched to space. As the successors to multiple observatories (like Hubble, Kepler, Spitzer, and others), these missions will carry out some of the most ambitious astronomical surveys ever mounted, ranging from the discovery and characterization of extrasolar planets to investigating the mysteries of Dark Matter and Dark Energy.

In addition to advanced imaging capabilities and high sensitivity, both instruments also carry coronagraphs – instruments that suppress obscuring starlight so exoplanets can be detected and observed directly. According to a selection of papers recently published in the Journal of Astronomical Telescopes, Instruments, and Systems (JATIS), we’re going to need more than coronagraphs if we truly want to study exoplanets in detail.

This Special Section on Starshades includes papers (released between January and June of 2021) that address the latest science, engineering, research, and programmatic advances made with starshades. Unlike a coronagraph, which blocks starlight inside the telescope, a starshade is a separate occulting spacecraft that flies in formation with the telescope, blocking the starlight before it ever enters the optics. These instruments address one of the greatest challenges in exoplanet detection and characterization. To summarize, the vast majority of known exoplanets (4,422 confirmed to date) have been discovered using indirect means.

Exoplanet Studies

Of these methods, the most widely-used and effective are the Transit Method (Transit Photometry) and the Radial Velocity Method (Doppler Spectroscopy). In the former, astronomers monitor stars for periodic dips in brightness, which are a possible indication that an orbiting exoplanet (or several) is passing in front of the parent star (aka. transiting) relative to the observer.

In the latter, astronomers measure how a star moves back and forth (and how fast) to gauge the gravitational influence of its orbiting planets. Separately, these methods are also effective at determining an exoplanet’s radius (Transit) and its mass (Radial Velocity). Used together, they are the most effective means of confirming and characterizing exoplanets, as well as placing constraints on their potential habitability.
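The transit half of that pairing is simple geometry: the fractional dip in starlight equals the square of the planet-to-star radius ratio. A minimal sketch:

```python
# Transit depth: the fraction of starlight blocked is (Rp / Rs)^2,
# which is how Transit Photometry yields a planet's radius.
R_SUN   = 6.957e8    # m
R_JUP   = 7.1492e7   # m
R_EARTH = 6.371e6    # m

def transit_depth(r_planet: float, r_star: float) -> float:
    """Fractional dip in brightness during a transit: (Rp / Rs)^2."""
    return (r_planet / r_star) ** 2

print(f"Jupiter-Sun: {transit_depth(R_JUP, R_SUN):.2%}")   # ~1%
print(f"Earth-Sun:   {transit_depth(R_EARTH, R_SUN):.4%}") # ~0.008%
```

The tiny Earth-Sun depth shows why detecting small rocky planets this way demands such exquisite photometric precision, and why the radius from a transit plus the mass from radial velocity are combined to get a planet’s bulk density.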

Continue reading
  249 Hits

SpaceZE.com