
Showing posts with label Scientists. Show all posts

Monday, May 19, 2014

Predicting climate: Scientists test seasonal-to-decadal prediction

In a new study published in Tellus A, Francois Counillon and co-authors at the Bjerknes Centre test seasonal-to-decadal prediction.

At the Bjerknes Centre, scientists are exploring the potential of seasonal-to-decadal climate prediction. This is a field still in its infancy, and a first attempt was published for the latest Intergovernmental Panel on Climate Change (IPCC) report.

Aside from a few isolated regions, prediction skill was moderate, leaving room for improvement. In the new study published in Tellus A, seasonal-to-decadal prediction is tested with an advanced initialisation method that has proven effective in weather forecasting and operational oceanography.

"Regular" climate projections are designed to represent the long-term change caused by external forcings. Such "projections" start from initial conditions that are far from the present-day climate and therefore fail to "predict" the year-to-year variability and most of the decadal variability -- such as the pause in the global temperature increase (the hiatus) or the recent spate of harsh winters in the northern hemisphere. By contrast, weather forecasts depend solely on the accuracy of the initial condition, because the influence of the external forcing is almost imperceptible.

On seasonal-to-decadal time scales, both the initial condition and the external forcing influence the prediction. Starting a climate prediction from an initial condition closer to the true state of the climate is therefore expected to yield better predictions than accounting for external forcing alone. In our region of interest, decadal skill may be achieved by improving the representation of the heat content transiting into the Nordic Seas, which in turn influences precipitation and temperature over Scandinavia.

The technique used to initialise (correct) a dynamical model is known as data assimilation. It estimates the initial state of the model given some sparse observations (fewer than 1% of the ocean variables are observed). A relationship between the observations and the non-observed variables must be found in order to spread the corrections.

In addition, the corrections must satisfy the model dynamics to prevent abrupt changes during the forecast. The Ensemble Kalman Filter uses statistics from an ensemble of forecasts to estimate the relationship between the observations and all the variables to be corrected. The method is computationally intensive, as it requires parallel integrations of the model, but it guarantees that the relationships evolve with the system and that the corrections satisfy the dynamics of the model.
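To illustrate the kind of update described above, the sketch below applies a single stochastic Ensemble Kalman Filter analysis step to a toy three-variable state in which only the first variable (think "SST") is observed. Everything here -- the three-variable "model", the correlations, and all numbers -- is invented for illustration and is not NorCPM or the study's actual setup; it only shows how ensemble covariances spread an SST correction to unobserved variables.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy state: 3 variables; only the first ("SST") is observed.
n_state, n_ens = 3, 50

# Ensemble of model states (columns are members), drawn with
# cross-correlations so the observed variable informs the others.
truth = np.array([1.0, 2.0, -1.0])
L = np.array([[1.0, 0.0, 0.0],
              [0.6, 0.8, 0.0],
              [-0.4, 0.2, 0.9]])
X = truth[:, None] + L @ rng.standard_normal((n_state, n_ens))

H = np.array([[1.0, 0.0, 0.0]])  # observation operator: observe variable 0
obs_var = 0.1                    # observation-error variance
y = truth[0] + rng.normal(0.0, np.sqrt(obs_var))  # synthetic observation

# Ensemble statistics
x_mean = X.mean(axis=1, keepdims=True)
A = X - x_mean                   # ensemble anomalies
P = A @ A.T / (n_ens - 1)        # sample covariance

# Kalman gain: spreads the SST correction to the unobserved variables
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + obs_var)

# Stochastic EnKF: perturb the observation for each ensemble member
Y = y + rng.normal(0.0, np.sqrt(obs_var), size=(1, n_ens))
X_a = X + K @ (Y - H @ X)        # analysis ensemble
```

The analysis ensemble is pulled toward the observation while the cross-covariances in P adjust the two unobserved variables consistently, which is the mechanism the paragraph above describes.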

The Norwegian Climate Prediction Model (NorCPM) combines the Norwegian Earth System Model with the Ensemble Kalman Filter. In time, we plan to perform retrospective decadal forecasts (hindcasts) for the last century, to test the skill of the system in different phases of the climate and to reveal the relative importance of internal and external influences on natural climate variability, including the importance of feedback mechanisms. Sea surface temperatures (SST) are the only observations available for such a long period and will be used for initialisation.

Our study investigates the potential skill of assimilating SST only, using an idealised framework, i.e. one in which the synthetic truth is taken from the same model at a different time. This framework allows a thorough validation, since the full truth is known and the system can be tested against its upper predictive limit (the case where observations would be available everywhere). NorCPM demonstrated decadal predictability for the Atlantic meridional overturning circulation and the heat content in the Nordic Seas that is close to the model's limit of predictability. Although these results are encouraging, the idealised framework assumes the model is perfect, and lower skill is expected in a real framework. This verification is currently ongoing.
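In a perfect-model (twin) experiment like this, skill is commonly measured by correlating forecasts against the synthetic truth, for instance with the anomaly correlation coefficient. The following is a minimal sketch of that standard score; the function name and the synthetic data are ours, not taken from the paper.

```python
import numpy as np

def anomaly_correlation(forecast, truth):
    # Correlation of forecast and truth anomalies about their own means;
    # 1 means perfect skill, 0 means no skill.
    f = forecast - forecast.mean()
    t = truth - truth.mean()
    return float(f @ t / np.sqrt((f @ f) * (t @ t)))

# Synthetic truth taken from the model itself (perfect-model assumption),
# and a forecast that tracks it with some error.
rng = np.random.default_rng(0)
truth = rng.standard_normal(100)
forecast = truth + 0.3 * rng.standard_normal(100)
acc = anomaly_correlation(forecast, truth)
```

Scoring the SST-only hindcasts against the known synthetic truth in this way is what allows the comparison with the "observations everywhere" upper limit mentioned above.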



Sunday, February 16, 2014

Scientists target sea level rise to save thousands of years of archaeological evidence

Prehistoric shell mounds found on some of Florida's most pristine beaches are at risk of washing away as the sea level rises, wiping out thousands of years of archaeological evidence.

"The biggest threat to these ancient treasure troves of information is sea level rise," said Shawn Cruz, a senior research associate with the Center for Ocean-Atmospheric Prediction Studies at Florida State University.

But a joint project between Cruz and the National Park Service is drawing attention to the problem in the hope of minimizing the impact on the state's cultural sites.

Cruz and Margo Schwadron, an archaeologist with the National Park Service, have embarked on a project to examine past and future changes in climate and how we can adapt to those changes in order to save areas of shoreline and thus preserve cultural and archaeological evidence.

"We are kind of the pioneers in looking at the cultural focus of this problem," Cruz said, noting that most weather and ocean experts are concerned with city infrastructure in coastal areas.

To complete the work, the National Park Service awarded Cruz a $30,000 grant. With that money, Cruz and former Florida State University undergraduate Marcus Manley spent hours compiling modern, colonial and paleo climate data.

The focus of the initial research is the Canaveral National Seashore and Everglades National Park, which have prehistoric shell mounds about 50 to 70 feet high. Researchers believe these shell mounds served as foundations for structures and settlements, and later served as navigational landmarks during European exploration of the region.

Modern temperature and storm system information was readily available to the researchers. But going hundreds and then thousands of years into the past required a rather different approach.

Log books from old Spanish forts, as well as from ships that crossed the Atlantic, had to be examined to find the missing information.

The result was a comprehensive data set for the region, so detailed that modern-era conditions are now available on an hourly basis.

Cruz and Schwadron are trying to secure more funding to continue their work, but for now they are making their data set available to the public and to other scientists in the hope of raising awareness about the unexpected effects of sea level rise.

The National Park Service has also released a brochure on climate change and the impact that sea level rise could have on the shell mounds found at Cape Canaveral.



Monday, September 9, 2013

Scientists find large Gulf dead zone, but smaller than predicted

July 29, 2013


Map showing the hypoxia area on the Louisiana Gulf of Mexico shelf in 2013.

(Credit: LUMCON (Rabalais))

NOAA-supported scientists found a large Gulf of Mexico low-oxygen, or hypoxic, “dead” zone, but not as large as had been predicted. Measuring 5,840 square miles, an area the size of Connecticut, the 2013 Gulf dead zone indicates nutrients from the Mississippi River watershed are continuing to affect the nation’s commercial and recreational marine resources in the Gulf.

“A near-record area was expected because of wet spring conditions in the Mississippi watershed and the resultant high river flows which deliver large amounts of nutrients,” said Nancy Rabalais, Ph.D., executive director of the Louisiana Universities Marine Consortium (LUMCON), who led the July 21-28 survey cruise. “But nature’s wind-mixing events and winds forcing the mass of low oxygen water towards the east resulted in a slightly above average bottom footprint.”

Hypoxia is fueled by nutrient runoff from agricultural and other human activities in the watershed. These nutrients stimulate an overgrowth of algae that sinks, decomposes and consumes most of the oxygen needed to support life. Normally the low or no oxygen area is found closer to the Gulf floor as the decaying algae settle towards the bottom. This year researchers found many areas across the Gulf where oxygen conditions were severely low at the bottom and animals normally found at the seabed were swimming at the surface.


Graph showing historical hypoxia trends.

(Credit: LUMCON (Rabalais))

This is in contrast to 2012, when drought conditions resulted in the fourth-smallest dead zone on record, measuring 2,889 square miles, an area slightly larger than Delaware. The largest dead zone on record occurred in 2002, encompassing 8,481 square miles. The smallest recorded dead zone measured 15 square miles in 1988. The average size of the dead zone over the past five years has been 5,176 square miles, more than twice the 1,900-square-mile goal set by the Gulf of Mexico / Mississippi River Watershed Nutrient Task Force in 2001 and reaffirmed in 2008.
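The "more than twice the goal" comparison follows directly from the figures quoted above; a quick back-of-the-envelope check:

```python
# Figures quoted above, in square miles.
five_year_avg = 5176   # average dead zone size over the past five years
goal = 1900            # task force goal set in 2001, reaffirmed in 2008

ratio = five_year_avg / goal      # about 2.7 times the goal
more_than_twice = five_year_avg > 2 * goal
```

So the recent average exceeds the task force target by a factor of roughly 2.7.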

On June 18, NOAA-sponsored forecast models developed by Donald Scavia, Ph.D., University of Michigan, and R. Eugene Turner, Ph.D., Louisiana State University, predicted the Gulf hypoxic zone would range in size from 7,286 to 8,561 square miles.

“NOAA’s investment in the Gulf of Mexico continues to yield results that confirm the complex dynamics of hypoxia and provide managers and the public with accurate scientific information for managing and restoring the nation's valuable coastal resources,” said Robert Magnien, Ph.D., director of NOAA’s Center for Sponsored Coastal Ocean Research. “For those who depend upon and enjoy the abundant natural resources of the Gulf of Mexico, it is imperative that we intensify our efforts to reduce nutrient pollution before the ecosystem degrades any further.”

This annual measurement provides federal and state agencies working on the 2008 Gulf task force implementation actions with a gauge of the real consequences of inadequate nutrient pollution management. The task force’s actions are set for review this summer.

The hypoxic zone off the coast of Louisiana and Texas forms each summer, threatening the ecosystem that supports valuable commercial and recreational Gulf fisheries, which in 2011 had a commercial dockside value of $818 million and an estimated 23 million recreational fishing trips. The Gulf task force, in its 2008 report, states that "hypoxia has negative impacts on marine resources." It further states that research on living resources in the Gulf shows long-term ecological changes in species diversity and large-scale, often rapid, change in the ecosystem's food web that is "difficult, if not impossible, to reverse." Additionally, each year there are numerous areas of the Gulf where large-scale fish kills occur as a result of hypoxia.

Two surveys conducted in June and early July, one of which was led by a NOAA-supported Texas A&M University team, suggested a large hypoxic zone was forming in the Gulf, though the LUMCON July measurement will be the official one as required of NOAA by the Task Force. NOAA’s National Marine Fisheries Service, in conducting its Southeast Monitoring and Assessment Program groundfish surveys, also found large expanses of hypoxia in June-early July. Texas A&M will be conducting a follow-up cruise in mid-August to provide its final seasonal update.

Visit the Gulf Hypoxia web site for additional graphics and information concerning this summer’s LUMCON research cruise, and previous cruises.

NOAA’s National Ocean Service has been funding monitoring and research for the dead zone in the Gulf of Mexico since 1985 and currently oversees the NGOMEX program, the hypoxia research effort for the northern Gulf which is authorized by the Harmful Algal Bloom and Hypoxia Research and Control Act.

The National Centers for Coastal Ocean Science is the coastal science office for NOAA’s National Ocean Service.

NOAA’s mission is to understand and predict changes in the Earth's environment, from the depths of the ocean to the surface of the sun, and to conserve and manage our coastal and marine resources. Join us on Facebook, Twitter and our other social media channels.



Monday, February 20, 2012

Scientists a step closer to predicting tornadoes

For decades, meteorologists have been able to forecast the severity of hurricane seasons several months ahead of time. Yet forecasting the likelihood of a bad tornado season has proved a far greater challenge.

Brenna Burzinski looks through the rubble in her devastated apartment in Joplin, Mo., on May 25. By Charlie Riedel, AP


Now, research from scientists at Columbia University's International Research Institute for Climate and Society could eventually lead to the first seasonal tornado outlooks.

"Understanding how climate shapes tornado activity makes forecasts and projections possible, and allows us to look into the past and understand what happened," said Michael Tippett, lead author of a study in the February issue of the journal Geophysical Research Letters.

The need for such data is reinforced by the still-fresh memory of 550 Americans killed by tornadoes last year — coupled with an unusually violent January for twisters.

In the study, Tippett and his team looked at 30 years of past climate data. They used computer models to determine that the two weather factors most tied to active tornado months and seasons were heavy rain from thunderstorms and extreme wind shear (wind blowing from different directions at different layers of the atmosphere).

"If, in March, we can predict average thunderstorm rainfall and wind shear for April, then we can infer April tornado activity," Tippett says.
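The inference Tippett describes -- predict tornado activity from forecast rainfall and wind shear -- can be illustrated with a toy multiple regression. The predictor ranges, coefficients, and data below are entirely made up for illustration and are not the study's actual statistical model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly predictors for 30 past Aprils:
# mean thunderstorm rainfall (mm/day) and wind shear (m/s).
rain = rng.uniform(2.0, 8.0, size=30)
shear = rng.uniform(5.0, 25.0, size=30)

# Synthetic tornado counts that, by construction, rise with both predictors.
counts = 3.0 * rain + 1.5 * shear + rng.normal(0.0, 5.0, size=30)

# Least-squares fit: counts ~ b0 + b1*rain + b2*shear
X = np.column_stack([np.ones_like(rain), rain, shear])
beta, *_ = np.linalg.lstsq(X, counts, rcond=None)

# Given predicted April rainfall (6 mm/day) and shear (20 m/s),
# infer expected April tornado activity.
predicted = np.array([1.0, 6.0, 20.0]) @ beta
```

The fitted coefficients recover the positive dependence on both predictors, which is the sense in which large-scale monthly averages can stand in for tornado activity itself.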

The method worked for each month except for September and October, and it worked best in June.

This is the first time a forecast of up to a month in advance has been demonstrated, he says.

"A connection between La NiƱa and spring tornado activity is often mentioned," Tippett says, "but such a connection really has not been demonstrated in the historical data and hasn't been shown to provide a basis for a skillful tornado activity forecast.

"Our work bridges the gap between what the current technology is capable of forecasting (large-scale monthly averages of rainfall and winds) and tornado activity, which the current technology cannot capture," he says.

The research isn't ready for prime time yet, however, so no official forecast will be made for the upcoming season using these methods.

"This is a useful first step," says Harold Brooks of the National Oceanic and Atmospheric Administration, who was not involved in the study. He says it will be helpful to know, for example, that sometime in the last week of April, conditions will be favorable for lots of tornadoes in the eastern USA.

With greater lead time, a state emergency planner "could be better prepared with generators and supplies," Brooks says.

