The report from the National Oceanic and Atmospheric Administration came just two days after the US space agency NASA released its climate data, which also found July was a record-breaking month.
"July is typically the hottest month for the globe, and last month didn't disappoint," said a summary of the monthly report by NOAA.
"July 2016 was 1.57 degrees Fahrenheit (0.87 degree Celsius) above the 20th-century average, breaking last year's record for the warmest July on record by 0.11 degrees."
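The Fahrenheit and Celsius anomalies quoted above differ only by the 5/9 scale factor (the 32-degree offset cancels when converting a temperature *difference*). A minimal sketch of that conversion, with an illustrative function name:

```python
def f_anomaly_to_c(delta_f):
    """Convert a temperature anomaly (a difference, not an absolute reading)
    from Fahrenheit to Celsius. Differences need only the 5/9 scale factor."""
    return delta_f * 5.0 / 9.0

# NOAA's July 2016 anomaly: 1.57 F above the 20th-century average
print(round(f_anomaly_to_c(1.57), 2))  # 0.87, matching the Celsius figure quoted
```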
Scientists say the heating trend is being driven by fossil-fuel burning, and is made worse by the ocean warming phenomenon known as El Nino, which came to an end last month.
July's global average of temperatures taken over land and ocean surfaces was the "highest for any month in the NOAA global temperature dataset record, which dates back to 1880."
July also marks the 15th consecutive month of breaking monthly temperature records, "the longest such streak in the 137-year record," NOAA said.
- Scorching Gulf -
The report found above-average warmth across most of the Earth, with new records observed in parts of Indonesia, southern Asia, and New Zealand.
Scorching temperatures were seen in part of the Gulf region, with several locations across Kuwait experiencing temperatures higher than 113 F during July.
"The highest maximum temperature during July 2016 was recorded in Mitribah, Kuwait, when temperatures soared to 126.5 F on July 22," it said.

In Bahrain, the average temperature of 96.8 F for the month was the nation's highest July temperature since national records began in 1902.
New Zealand, Spain and Hong Kong were also unusually warm.
Places that saw near-average or cooler-than-normal temperatures last month included the northwestern United States, eastern Canada, southern South America, southwestern Australia, north central Russia, Kazakhstan, and India.
Ocean temperatures were also at a record high, amid concerns that warming waters are contributing to the spread of coral bleaching worldwide.
NOAA said the 13 highest monthly global ocean temperature departures have all occurred in the past 13 months.
Heat records were broken even though El Nino has ended, and neither the warming trend of El Nino nor the cooler La Nina prevailed across the tropical Pacific Ocean during July 2016.
La Nina is "slightly favored to develop during August-October 2016, with about 55-60 percent chance of La Nina during the northern hemisphere fall and winter 2016/17," NOAA said.
But even the end of El Nino, which contributed to this year's surging global temperatures, is unlikely to knock 2016 off its track toward becoming the hottest year in the modern record.
NOAA said the first seven months of the year are the "warmest such period on record at 1.85 F above the 20th century average."
That is one-third of a degree F above the previous record set in 2015.
A new Concordia study published in the International Journal of Primatology shows that the world's primate populations may be seriously impacted by climate change.
"Our research shows that climate change may be one of the biggest emerging threats to primates, compounding existing pressures from deforestation, hunting and the exotic pet trade," says Tanya Graham, the article's lead author and an MSc student in the Department of Geography, Planning and Environment.
She worked with environment professor Damon Matthews from Concordia and primatology post-doctoral researcher Sarah Turner from McGill to assess the exposure and potential vulnerability of all non-human primate species to projected future temperature and precipitation changes.
They found that overall, 419 species of non-human primates - such as various species of lemurs, lorises, tarsiers, monkeys and apes - will experience 10 per cent more warming than the global average, with some primate species experiencing increases of more than 1.5 degrees Celsius in annual average temperature for every degree of global warming.
The researchers also identified several hotspots of primate vulnerability to climate change, based on the combination of the number of species, their endangered status and the severity of climate changes at each location. Overall, the most extreme hotspots, which represent the upper 10 per cent of all hotspot scores, cover a total area of 3,622,012 square kilometres over the ranges of 67 primate species.
The highest hotspot scores occur in Central America, the Amazon and southeastern Brazil, as well as portions of East and Southeast Asia - prime territory for some of the globe's best-known primates that call these areas home.
The ursine howler monkey, black howler monkey, and Barbary macaque are expected to be exposed to the highest magnitude of climate change when both temperature and precipitation are considered. For example, the ursine howler monkey, found in Venezuela, will experience an increase of 1.2 degrees Celsius in annual average temperature and a 5.3 per cent decline in annual rainfall for each degree of global temperature increase.
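The per-degree figures quoted for the ursine howler monkey can be read as simple scaling factors against a global warming scenario. A minimal sketch (the function name and the 2 C scenario are illustrative, not from the study; defaults use the article's quoted values):

```python
def local_exposure(global_warming_c, temp_per_degree=1.2, rain_pct_per_degree=-5.3):
    """Scale per-degree-of-global-warming exposure figures to a warming scenario.
    Defaults are the ursine howler monkey values quoted in the article."""
    return {
        "temperature_increase_c": temp_per_degree * global_warming_c,
        "rainfall_change_pct": rain_pct_per_degree * global_warming_c,
    }

# Under a hypothetical 2 C of global warming, the article's figures imply
# a 2.4 C local temperature rise and a 10.6 per cent rainfall decline:
exposure = local_exposure(2.0)
print(round(exposure["temperature_increase_c"], 1))  # 2.4
print(round(exposure["rainfall_change_pct"], 1))     # -10.6
```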
"This study highlights the vulnerability of individual species, as well as regions in which primates as a whole may be vulnerable to climate change," says Matthews, who will present the findings of this study during the Joint Meeting of the International Primatological Society and the American Society of Primatologists in Chicago later this month.
"Our findings can be taken as priorities for ongoing conservation efforts, given that any success in decreasing other current human pressures on endangered species may also increase that species' ability to withstand the growing pressures of climate changes," says Graham.
"Primates are often seen as flagship species for entire ecosystems, so conservation can have important ramifications for many other species too. I hope our study will help direct conservation efforts for individual primate species in particular, but also for vulnerable ecosystems in general throughout the tropical regions inhabited by non-human primates," adds Turner.
A new paper published online August 4, 2016, in the peer-reviewed journal Proceedings of the National Academy of Sciences (PNAS) describes this previously unobserved flame phenomenon, which burns nearly soot-free.
"Blue whirls evolve from traditional yellow fire whirls. The yellow color is due to radiating soot particles, which form when there is not enough oxygen to burn the fuel completely," said Elaine Oran, Glenn L. Martin Institute Professor of Engineering and co-author of the paper. "Blue in the whirl indicates there is enough oxygen for complete combustion, which means less or no soot, and is therefore a cleaner burn."
The Clark School team initially set out to investigate the combustion and burning dynamics of fire whirls on water. What they discovered was a novel, swirling blue flame that they say could help meet the growing worldwide demand for high-efficiency, low-emission combustion.
"A fire tornado has long been seen as this incredibly scary, destructive thing. But, like electricity, can you harness it for good? If we can understand it, then maybe we can control and use it," said Michael Gollner, assistant professor of fire protection engineering and co-author of the paper.
"This is the first time fire whirls have been studied for their practical applications," Gollner added.
Some oil spill remediation techniques include corralling the crude oil to create a thick layer on the water surface that can be burned in place, but the resulting combustion is smoky, inefficient, and incomplete.
However, the Clark School researchers say blue whirls could improve remediation-by-combustion approaches by burning the oil layer with increased efficiency, reducing harmful emissions into the atmosphere around it and the ocean beneath it.
"Fire whirls are more efficient than other forms of combustion because they produce drastically increased heating to the surface of fuels, allowing them to burn faster and more completely.
"In our experiments over water, we've seen how the circulation fire whirls generate also helps to pull in fuels. If we can achieve a state akin to the blue whirl at larger scale, we can further reduce airborne emissions for a much cleaner means of spill cleanup," explained Gollner.
Beyond improvements to fuel efficiency and oil spill remediation, there are currently few easy methods to generate a stable vortex in the lab, so the team hopes their discovery of the 'blue whirl' can serve as a natural research platform for the future study of vortices and vortex breakdown in fluid mechanics.
"A fire whirl is usually turbulent, but this blue whirl is very quiet and stable without visible or audible signs of turbulence," said Huahua Xiao, assistant research scientist in the Clark School's Department of Aerospace Engineering and corresponding author of the paper. "It's really a very exciting discovery that offers important possibilities both within and outside of the research lab."
The paper, "From fire whirls to blue whirls and combustion with reduced pollution," Huahua Xiao, Michael J. Gollner, and Elaine S. Oran, was published August 4, 2016, in the journal Proceedings of the National Academy of Sciences (PNAS).
However, new research from George Ban-Weiss, an assistant professor in the Astani Department of Civil and Environmental Engineering at the USC Viterbi School of Engineering, has found that these efforts might have some hidden consequences on local climate.
Ban-Weiss and post-doctoral scholar Pouya Vahmani used a model of the Los Angeles basin to investigate the climate impacts of widespread adoption of drought tolerant vegetation.
Their findings, put forth in the article "Climatic Consequences of Adopting Drought Tolerant Vegetation over Los Angeles as a Response to the California Drought" in the journal Geophysical Research Letters, indicate that if all lawns were replaced with drought tolerant vegetation, Angelenos could expect an average daytime warming of 1.3 degrees Fahrenheit, due largely to decreased evaporative cooling as irrigation is stopped.
For the hottest regions of the Los Angeles basin, such as the Inland Empire and the San Fernando Valley, the researchers predict a daytime increase in temperature of 3.4 degrees Fahrenheit. Such temperature increases could exacerbate heatwaves, increase photochemical smog production, and increase air conditioning energy use.
However, one effect of widespread planting of drought tolerant vegetation-- which the researchers believe could counteract these higher daytime temperatures-- is an even greater decrease in nighttime temperatures. The researchers forecast that the average nighttime temperature decrease could be as much as 6 degrees Fahrenheit.
Lower nighttime temperatures are important for preventing adverse human health consequences like heat stroke or even death during heat waves, says Ban-Weiss. People, especially vulnerable populations like the elderly, need temperatures to reduce sufficiently at night to allow their bodies to recover from high daytime temperatures and prevent heat-related illness.
"Our interest in this topic was initially piqued because we hypothesized that the reductions in irrigation associated with adopting drought-tolerant vegetation would cause temperature increases," says Ban-Weiss.
"We were surprised to find the reduced temperature signal at nighttime. But this actually has a simple physical explanation, since reducing soil moisture decreases upward heat fluxes from the sub-surface to the surface at night, subsequently reducing surface temperatures."
"Our research highlights how water and climate are intimately coupled," says Ban-Weiss. "You can't change one without affecting the other."
Clarifying the fusion plasma confinement improvement mechanism
(Figure caption: The right side of the figure, where the electric field's absolute value is small, corresponds to the L-mode plasma; the left side, with a large electric field, to the H-mode. The black line indicates the experimental value of the electric current; the red line, the theoretical value from the model based upon differences in the trajectories. Image courtesy Tatsuya Kobayashi.)
National Institutes of Natural Sciences
by Staff Writers Tokyo, Japan (SPX) Aug 08, 2016

In seeking the realization of the fusion reactor, research on confining high-temperature, high-density plasma in a magnetic field is being conducted around the world. One of the most important issues in realizing a power-generating reactor is that "turbulence"*1), the turbulent flow of plasma, causes plasma confinement to deteriorate. When turbulence exists, high-temperature core plasma is expelled to the outside of the plasma, and the conditions for producing fusion cannot be achieved.
The key to solving this problem was accidentally discovered on a German experimental device in 1982. There, turbulence in the edge region was suppressed, and the plasma state called the "H-mode"*2), in which the temperature of the entire plasma was raised, was realized. In contrast to this, plasma in which turbulence is great and the temperature is low is called "L-mode" plasma. Subsequently, H-mode plasma was reproduced in devices around the world. H-mode is used as the standard operation mode in the ITER.
In research to date, many researchers have attempted to clarify the H-mode mechanism. In the normal condition, in which ions and electrons are in balance, there cannot be a strong electric field in a plasma. This condition corresponds to the L-mode. According to theory, in the H-mode a small deviation in the distribution of ions and electrons generates a strong electric field in the plasma's edge region, and this field was predicted to suppress the turbulence.
Subsequently, using the most advanced diagnostics of that time, the structure of the electric field that theoretical researchers had predicted was shown to actually exist. How this structure of the electric field forms remained the riddle. Research on this question has been advancing since the discovery of the H-mode, but the issue went unsolved for more than thirty years, until now.
Here, the research group of professors Tatsuya Kobayashi, Kimitaka Itoh, and Takeshi Ido of NIFS engaged in collaborative research with the National Institutes for Quantum and Radiological Science and Technology (QST) and Kyushu University. Using the "Heavy Ion Beam Probe" developed at NIFS, they measured the plasma potential of QST's JFT-2M tokamak. From their analysis of the experimental data, they discovered the electric field generation mechanism that had been a riddle for the past thirty years.
The "deviation" in the distribution of ions and electrons producing the strong electric field is made by the electric current that flows in the radial direction. The mechanism that produces this electric current has been proposed numerous times, but which effect was especially important had not yet been clarified.
Through their measurements and analysis, they found that the effects born from the differences in the trajectories of electrons and ions play a particularly important role in the generation of the electric current.
Although the experimental data used in these experiments were collected in 1999, they produced cutting edge results from recent theoretical physics and developments in analytical methods. That results from seventeen years earlier have contributed to advances in plasma physics today indicates the high quality of this experimental data.
These research results are published in Scientific Reports (online edition), an academic science journal published in Great Britain by the Nature Publishing Group. The article is dated August 4, 2016.
Figure 1 shows the dependence of the electric current in the radial direction (Jr) on the electric field (Er). The right side of the figure where the electric field's absolute value is small corresponds to the L-mode plasma and the left side having large electric field to the H-mode.
The black line indicates the experimental value of the electric current and the red line the theoretical value used in the model based upon differences in the trajectories. In this experiment, when the plasma changes from L-mode to H-mode the theoretical value well matches the experimental value. From this, the mechanism by which the electric current is generated becomes clear. However, the electric current value of the L-mode indicates a large difference between the theoretical value and the experimental value. In order to clarify the cause of this, further research is necessary.
Figure 2 shows the temperature profiles of the L-mode plasma and the H-mode plasma. In the H-mode, the temperature increased in the edge region, and in the whole plasma a high temperature was achieved.
Explanation of Terminology:
*1)When plasma is heated, small spiral flows are generated naturally, and the rise of temperature is hindered. Because these small spiral flows move in various directions and are of different sizes, these movements are called "turbulent flows." When a turbulent flow is generated, a core plasma with a high temperature rides on an eddy in the turbulent flow and is expelled out of the plasma. As a result, the turbulent flow hinders the increase of the plasma's temperature.
*2) H-mode is the high-confinement mode that means "a good confinement condition," and was discovered in the German experimental device ASDEX. When a plasma is not in H-mode (L-mode stands for low confinement mode), even when power for heating the plasma is increased, it is difficult to raise the temperature because of the existence of turbulence, and conditions for achieving fusion cannot be met.
On the other hand, when a plasma is in H-mode, turbulence in the plasma's edge region is suppressed by the electric field, so the temperature throughout the plasma can be raised. Thus, as a key to achieving success with the future fusion reactor, the principles and applications of the H-mode are being researched around the world.
Dobrowski and co-author Sean Parks - a scientist at the U.S. Department of Agriculture, Forest Service Aldo Leopold Research Institute and a UM alumnus - propose a new method to model how fast and where organisms will need to move to keep pace with climate change.
Mountains support roughly a quarter of the globe's terrestrial biodiversity, contain about a third of its protected areas and house nearly half of the world's biodiversity hotspots.
One reason for this biodiversity is that complex topography within mountains creates diverse climates within close proximity to one another.
One way scientists measure how vulnerable a site is to climate change is to estimate how far organisms at that site need to move to maintain a consistent temperature as the Earth warms.
The diversity of climates in mountain landscapes means that when temperatures rise, organisms might have to only move a short distance to get to a cooler home.
However, Dobrowski and Parks show that measuring the distance from one area of suitable climate to the next doesn't account for the resistance organisms will encounter as they traverse areas with very different climates, like a warm valley between two mountain peaks.
"It's not enough to just measure how far an organism will have to move in order to keep up with climate change," Dobrowski said.
"We also need to look at how much organisms will be exposed to dissimilar climates along the way. Once we do that, we find that even short movements in mountainous areas expose organisms to large climate differences. This may prevent plants and animals from being able to maintain a suitable climate as the earth warms."
Dobrowski and Parks suggest that areas within mountains are more climatically isolated and thus more vulnerable to climate change than previously reported.
"Climate change velocity underestimates climate change exposure in mountainous regions" was published Aug. 1 in Nature Communications.
Drs. Min Liu and Yuanjie Pang, along with a team of graduate students and post-doctoral fellows in U of T Engineering, have developed a technique powered by renewable energies such as solar or wind. The catalyst takes climate-warming carbon dioxide (CO2) and converts it to carbon monoxide (CO), a useful building block for carbon-based chemical fuels, such as methanol, ethanol and diesel.
"CO2 reduction is an important challenge due to inertness of the molecule," says Liu. "We were looking for the best way to both address mounting global energy needs and help the environment," adds Pang. "If we take CO2 from industrial flue emissions or from the atmosphere, and use it as a reagent for fuels, which provide long-term storage for green energy, we're killing two birds with one stone."
The team's solution is sharp: they start by fabricating extremely small gold "nanoneedles" - the tip of each needle is 10,000 times smaller than a human hair. "The nanoneedles act like lightning rods for catalyzing the reaction," says Liu.
When they applied a small electrical bias to the array of nanoneedles, they produced a high electric field at the sharp tips of the needles.
This helps attract CO2, speeding up the reduction to CO, with a rate faster than any catalyst previously reported. This represents a breakthrough in selectivity and efficiency which brings CO2 reduction closer to the realm of commercial electrolysers. The team is now working on the next step: skipping the CO and producing more conventional fuels directly.
Their work is published in the journal Nature.
"The field of water-splitting for energy storage has seen rapid advances, especially in the intensity with which these reactions can be performed on a heterogeneous catalyst at low overpotential - now, analogous breakthroughs in the rate of CO2 reduction using renewable electricity are urgently needed," says Michael Graetzel, a professor of physical chemistry at Ecole Polytechnique Federale de Lausanne and a world leader in this field.
"The University of Toronto team's breakthrough was achieved using a new concept of field-induced reagent concentration."
"Solving global energy challenges needs solutions that cut across many fields," says Sargent. "This work not only provides a new solution to a longstanding problem of CO2 reduction, but opens possibilities for storage of alternative energies such as solar and wind."
Scientist Robert Field of NASA's Goddard Institute for Space Studies in New York and colleagues compared the fire season to those of previous years in Indonesia, using NASA satellite data since the 2000s and Indonesian airport visibility beginning in the 1990s.
The length and severity of the fire season in Indonesia is strongly influenced by the amount of rainfall during the dry season, Field said. As a result of the recent El Nino, 2015 rainfall was low enough for fires to spread underground into drained, degraded peat swamps, where they burn longer.
Over the two main burning regions of Sumatra and Kalimantan (Indonesian Borneo), fire activity and smoke emissions were the largest recorded over the region by NASA satellite instruments, most of which have been collecting data since the early 2000s. Data from five instruments on the NASA Aura, Terra, and Aqua satellites tracked active fires, carbon monoxide, and aerosol optical depth in the atmosphere.
"One of the instruments saw a plume of carbon monoxide from the fires stretching halfway around the world at the equator at about 12 km," Field said.
By looking at airport visibility going back to the 1990s, Field and colleagues were able to compare 2015 to severe events in 1991, 1994 and 1997. In Kalimantan, 2015's visibility reductions ranked behind 1997's because 2015 was much less dry.
Compared to 1991 and 1994, however, the 2015 haze from the smoke created by the fires was more severe but conditions were no drier. This suggests that Kalimantan's susceptibility to fires has gone up, continuing a trend seen since the 1980s with intensified land use and increased human-induced fires.
Across the multiple data sources, Field and colleagues identified the seasonal rainfall level below which severe burning is likely - an average of 4mm/day. "Many years, even during the dry season, it's still too wet for any serious burning to happen," he said.
"Knowing that rainfall threshold, you can look at a seasonal rainfall assessment and have a sense of whether it's going to be a normal or a bad fire year and plan ahead."
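The rainfall threshold described above amounts to a one-line seasonal check. A minimal sketch (the 4 mm/day threshold is from the article; the function and its labels are illustrative, not the researchers' actual tool):

```python
# Dry-season mean rainfall below this level favors severe burning (per the study)
SEVERE_FIRE_THRESHOLD_MM_PER_DAY = 4.0

def fire_season_outlook(dry_season_rainfall_mm_per_day):
    """Classify expected fire-season severity from mean dry-season rainfall."""
    if dry_season_rainfall_mm_per_day < SEVERE_FIRE_THRESHOLD_MM_PER_DAY:
        return "bad fire year likely"
    return "normal fire year likely"

print(fire_season_outlook(2.5))  # a dry El Nino year: "bad fire year likely"
print(fire_season_outlook(6.0))  # a wetter season: "normal fire year likely"
```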
Eliminating fire from degraded peatlands is a long-term goal that will require changes in land use and land tenure as Indonesia undergoes economic development. In the short term, fire prevention, suppression, and mitigation measures should be tied to early warning triggers.
The environmental ranges of many animal and plant species are starting to alter with climate change, as temperatures change and force species to migrate to more suitable climes.
To be able to do this successfully, they will need sufficient habitat in their existing range, their future range, and any intermediate areas to enable populations to survive and thrive. Many conservation initiatives to restore habitats and increase connectivity are trying to address this issue. However, existing modelling tools mainly treat the landscape as static, and it is difficult to use these to plan restoration.
In a new paper published in Methods in Ecology and Evolution, researchers proposed and tested two new computational methods that could help optimise this planning process.
Lead author, Dr Jenny Hodgson explained: "Sites across a wide area can be thought of as an 'ecological network' and to be really effective these networks need to be bigger, better and more joined up.
"We take the most essential features of the landscape configuration, and how species can reproduce and disperse, and we translate these into equations, using insights from physics.
"The solutions to these equations show us which existing patches are best for facilitating range shifts, and which gaps between patches are acting as serious bottlenecks. Using these values, the computer can automatically add new habitat and rearrange the landscape in order to improve range shifting potential."
Some of the methods developed in this new study are already implemented in a freely-available decision support tool called Condatis. The software, which was developed by the team as part of a knowledge exchange project, identifies the best locations for habitat creation and restoration to improve connectivity across landscapes.
Dr Hodgson said: "We are excited by these findings because we think they bridge the gap between methods for visualising connectivity pathways and multi-purpose conservation optimisation tools.
"For the moment we are demonstrating our new methods in simplified scenarios. The next step will be to think about optimising them for multiple species in multiple directions, and about prioritising for long-term survival in one place as well as the speed of range shifting."
Condatis, which launched last year, is already being used by organisations including Buglife, the Yorkshire Dales National Park and Warwickshire County Council.
Dr Hodgson added: "Our knowledge exchange network has expanded greatly since Condatis was first launched. Work on the underpinning science and user-friendly features that practitioners need will continue in parallel in the years ahead. We benefit hugely from our partners' enthusiasm and willingness to try new things."
The paper "How to manipulate landscapes to improve the potential for range expansion" is published in Methods in Ecology and Evolution (DOI: 10.1111/2041-210X.12614)
Prime Minister Malcolm Turnbull -- a former environment minister -- has long been seen as a supporter of action on climate change but his government oversaw bitterly opposed cuts to national science body CSIRO earlier this year.
But following the government's narrow re-election last month, new Science Minister Greg Hunt stressed both he and Turnbull had "clear and strong views" about the importance of climate science.
"It's critical to our long-term planning... so climate science will be a bedrock function for research of the CSIRO," Hunt told national broadcaster ABC.
"It's a new government and we're laying out a direction that climate science matters -- and that's both the science relating to the long-term trend, the long-term influences, where the impacts are, as well as mitigation."
The conservative government's commitment to fighting global warming has been under international scrutiny since former prime minister Tony Abbott said evidence blaming mankind for climate change was "absolute crap" when he was opposition leader.
With its heavy use of coal-fired power and relatively small population of 24 million, Australia is considered one of the world's worst per capita greenhouse gas polluters.
Hunt, who was environment minister for a decade before moving to the science portfolio after last month's national elections, said 15 new jobs would be created and Aus$37 million (US$28 million) injected into climate research at CSIRO over the next decade.
CSIRO's chief executive Larry Marshall announced earlier this year that 275 roles would be cut from the national science body, including 75 in the oceans and atmosphere division and 70 in the land and water section.
The cuts were slammed by environmental campaigners, while almost 3,000 international scientists signed an open letter in February calling the move alarming and a reflection of the lack of insight about Australia's importance to global and regional climate research.
Greenpeace welcomed Hunt's announcement, adding that they hoped it was the "start of more positive leadership from the Turnbull government on the critical threat of climate change".
"The decision to cut climate science jobs at CSIRO during such a critical time for Australia's climate was preposterous, so this partial u-turn is a welcome move," Greenpeace Australia's senior climate and energy campaigner Nikola Casule said in a statement.
A dire picture of the Earth's health is painted in the State of the Climate report, a peer-reviewed 300-page tome that comes out once a year and is compiled by 450 scientists from around the world.
The record heat that the planet experienced last year was driven partially by global warming, and was exacerbated by the ocean heating trend known as El Nino, it said.
El Nino, which just ended in July, was one of the strongest the Earth has seen "since at least 1950," said the report, led by the National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information.
Thomas Karl, director of the NOAA division, described the report as an "annual physical" of the Earth's health.
"Clearly, the report in 2015 shows not only that the temperature of the planet is increasing, but all the related symptoms that you might expect to see with a rising temperature are also occurring," he told reporters on a conference call.
"El Nino certainly gave it a boost, so to speak, from the standpoint of global temperatures."
Adding to the health metaphor, the report also featured a haiku by co-author Gregory Johnson, an oceanographer with NOAA, who wrote:
"El Nino waxes,
warm waters shoal, flow eastward,
Earth's fever rises."
- New records -
Major concentrations of greenhouse gases -- including carbon dioxide (CO2), methane and nitrous oxide -- are the by-products of fossil fuel burning.
All three "rose to new record high values during 2015," said the findings, based on tens of thousands of measurements from multiple independent datasets.
The annual average atmospheric carbon dioxide (CO2) concentration at Mauna Loa, Hawaii, reached 400.8 parts per million (ppm), surpassing 400 ppm for the first time, marking "the largest annual increase observed in the 58-year record."
On average globally, 2015's CO2 level was 399.4 ppm, an increase of 2.2 ppm over 2014.
This "means that 2016 is easily going to surpass this milestone," said climatologist Jessica Blunden, lead editor at NOAA's National Centers for Environmental Information.
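Blunden's projection follows from simple arithmetic: adding another increase of roughly 2015's size to the 2015 global average already clears 400 ppm. A minimal sketch (illustrative only; the actual year-to-year growth rate varies):

```python
co2_2015_ppm = 399.4        # 2015 global average CO2 concentration (from the report)
annual_increase_ppm = 2.2   # 2015's observed rise over 2014

# If 2016 adds an increase of about the same size, the milestone is passed:
projected_2016 = co2_2015_ppm + annual_increase_ppm
print(round(projected_2016, 1))  # 401.6 ppm
print(projected_2016 > 400.0)    # True: the 400 ppm milestone is surpassed
```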
The report also confirmed the NOAA and NASA finding that Earth's average land and ocean surface temperatures warmed to record levels in 2015.
Karl and Blunden said experts foresee 2016 will set a new record for global heat.
"Just because the El Nino has ended does not mean that we are going to go back to where we were before. We are going to continue to climb," said Blunden.
Global sea levels swelled to their highest point yet, about 70 millimeters (about 2.75 inches) higher than the 1993 average.
Sea level is creeping up gradually around the globe, averaging about 3.3 millimeters per year, said the report.
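The two sea-level figures are mutually consistent: 70 mm of rise since 1993, spread over the roughly 22 years to 2015, works out to about the 3.3 mm per year quoted. A quick check:

```python
total_rise_mm = 70.0   # 2015 sea level above the 1993 average (about 2.75 inches)
years = 2015 - 1993    # span of the satellite record the report cites

average_rate_mm_per_year = total_rise_mm / years
print(round(average_rate_mm_per_year, 1))  # 3.2, close to the ~3.3 mm/year quoted
```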
Some places in the western Pacific and Indian Ocean are seeing waters rise faster.
Even though the current pace may appear slow, experts warn that sea level rise will accelerate in the coming decades as glaciers and polar ice caps melt, putting millions of lives at risk in coastal communities around the world.
- More extremes -
More extreme weather was seen in 2015, too, with an above-normal rainy season prompting major floods in some parts of the world.
Meanwhile, areas in severe drought nearly doubled, from eight percent of the planet in 2014 to 14 percent in 2015.
The Arctic, which is considered particularly sensitive to climate change, continued to warm, and increasing temperatures led to thinner and smaller sea ice cover.
"The Arctic land surface temperature tied with 2007 and 2011 as the highest since records began in the early 20th century, representing a 2.8 Celsius (5 Fahrenheit) increase since that time," said the report.
The Antarctic was colder than average, and the influence of El Nino on atmospheric circulation helped shift sea ice cover "from record high levels in May to record low levels in August," it said.
Across the globe, alpine glaciers continued to retreat for the 36th year in a row.
June's late spring snow cover in the northern hemisphere marked the second lowest in the 49-year satellite record.
Warming waters are also blamed for the severity of a widespread algal bloom last summer that stretched from central California to British Columbia, Canada, resulting in "significant impacts to marine life, coastal resources and the human communities that depend on these resources."
The Atlantic hurricane season was unusually mild for the second year in a row, largely due to El Nino, but tropical cyclones "were well above average overall," said the report.
There were 101 tropical cyclones across all ocean basins in 2015, well above the 1981-2010 average of 82 storms.
The eastern and central Pacific was roiled by 26 big storms, the most since 1992.