

Can Morocco resolve Europe’s energy crisis?



Morocco plans to supply renewable energy to Europe, but some want it to keep it for its home market.


More than a quarter of a billion people are going hungry in 2022



CLIMATE WIRE | LONDON. More than 250 million people faced severe hunger last year, with people in seven countries on the verge of starvation, according to a United Nations report released Wednesday.

The fallout from Russia’s war in Ukraine, including high food prices, has been a major cause of famine, especially in the world’s poorest countries that are still reeling from the economic turmoil caused by Covid-19. Conflict was another key factor, along with climate change and extreme weather events.

The report says that in 2022, about 258 million people in 58 countries or territories faced crisis-level or worse acute food insecurity – a sharp increase from 193 million in 2021.

Last year was the fourth year in a row that the number of people in dire need of food assistance increased, according to the Global Report on Food Crises, released Wednesday by the Food and Agriculture Organization of the United Nations, the World Food Programme, the World Bank, the European Union, the United States and members of the Global Network Against Food Crises.

“The latest data on the global situation of severe food insecurity paint a very worrying picture. They tell us that for four reports in a row, four years in a row, things are getting worse,” said Rein Paulsen, Director of FAO’s Office of Emergencies and Resilience.

Acute food insecurity is hunger severe enough to endanger people’s livelihoods and lives, and it threatens to slide into famine and cause massive loss of life.

About 376,000 people in seven countries – Afghanistan, Burkina Faso, Haiti, Nigeria, Somalia, South Sudan and Yemen – faced ‘catastrophic’ food insecurity, the most severe phase in the international food security classification system, the report says.

Conflict was the main cause of famine in 19 countries, while weather and climate extremes were the main cause of acute food insecurity for 57 million people in 12 countries, including Pakistan, where devastating floods destroyed crops and left millions without food last year.

War and pandemic

However, the main driver of the rise in food insecurity last year was the economic turmoil from Russia’s war in Ukraine combined with the lingering impact of Covid-19. Together, they were the main cause of acute hunger in 27 countries, affecting 84 million people.

Russia’s February 2022 invasion of Ukraine – one of the world’s largest exporters of wheat, corn and sunflower products – shook global trade in those commodities, as well as in fertilizers, causing food prices to skyrocket.

Pressure on prices has eased since last July, when a deal brokered by the UN and Turkey allowed Ukrainian grain exports to safely bypass the Russian blockade in the Black Sea. But the war continues to affect food security indirectly, especially in poor, import-dependent countries, the report warns.

Russia is also threatening to pull out of the Black Sea grain deal when the extension agreed in March expires on May 18. That could trigger another price spike, pushing millions more toward starvation.

According to Paulsen, the report is a wake-up call. It is “vital because it informs decision makers, informs donors, informs operational agencies and governments of the need for urgent action, extended action and the right action to address the situation,” he said.

What is needed, Paulsen added, is a shift from delivering food aid to agricultural interventions, which have “proven to be the most cost-effective way to respond to acute food insecurity.”

“Globally, we know that only 4 percent of all funding that goes to food security measures in the context of food crises, in the context covered by this report, goes to emergency agriculture,” he said. “This is something that needs to change if we really want to reverse the worsening trend these numbers show.”

This story first appeared in POLITICO Europe.

Reprinted from E&E News with permission from POLITICO, LLC. Copyright 2023. E&E News provides essential news for energy and environment professionals.



10 Scientific Words You’re Probably Using Wrong



If you have a theory that potato chips make you fat (of which your waistline is the proof), you have just used two scientific terms in a very unscientific manner.



US lawmakers say AI will never be able to launch nuclear weapons



Rep. Ted Lieu (D-CA) last week introduced the Block Nuclear Launch by Autonomous Artificial Intelligence Act, along with Sen. Edward Markey (D-MA) and several bipartisan co-sponsors. The bill’s purpose is as simple as its name: to ensure that AI never has a say in American nuclear strategy.

“While we are all trying to cope with the pace at which AI is accelerating, the future of AI and its role in society remains unclear. Our job as members of Congress is to exercise responsible foresight when it comes to protecting future generations from potentially devastating consequences,” said Rep. Lieu in the bill’s announcement, adding, “AI can never replace human judgment when it comes to launching nuclear weapons.”

He’s not the only one who thinks so – a 2021 Human Rights Watch report, co-authored with Harvard Law School’s International Human Rights Clinic, stated that “[r]obots lack the compassion, empathy, mercy, and judgment necessary to treat humans humanely, and they cannot understand the inherent worth of human life.”


If passed, the bill would legally codify existing Department of Defense procedures established in its 2022 Nuclear Posture Review, which states that “in all cases, the United States will maintain a human ‘in the loop’ for all actions critical to informing and executing decisions by the President to initiate and terminate nuclear weapon employment.” The bill would also bar federal funds from being used to launch nuclear weapons via an automated system without “meaningful human control,” according to the bill’s announcement.

The proposed law comes at a time when the capabilities of generative AI, including chatbots such as ChatGPT, are increasingly becoming part of the public discourse. But the surreal spectrum between “funny chatbot responses” and “potential existential threats to humanity” didn’t escape Lieu’s attention. He certainly never thought his civic responsibilities would include drafting legislation to head off a Skynet scenario, he tells PopSci.

Lieu, a “recovering computer science major” as he calls himself, is amazed at what AI programs are now capable of. “Voice recognition is pretty amazing right now. Facial recognition is pretty amazing right now, although it’s more inaccurate for people with darker skin,” he says, referring to long-documented patterns of algorithmic bias.

However, the release of generative AI programs over the past year, such as OpenAI’s GPT-4, was the moment Lieu began to see the potential for harm.


“It’s about creating information and predicting scenarios,” he says of the technologies available. “This leads to various concerns, including my view that AI, no matter how smart, should never have operational control over a nuclear weapon.”

Lieu believes it is vital to start discussing AI rules in order to mitigate three main harms. The first is the spread of misinformation and other content “harmful to society.” The second is AI that, while not a threat to humanity, “still might just kill you.” He points to a multi-car crash in San Francisco in November 2022 that injured several people and was allegedly caused by a Tesla running its controversial “Full Self-Driving” mode.

“When your cellphone malfunctions, it’s not going 50 miles an hour,” he says.

Finally, there is “an AI that can literally destroy the world,” Lieu says. And that is where he believes the Block Nuclear Launch by Autonomous Artificial Intelligence Act can help, at least to some extent. In essence, if the bill becomes law, artificial intelligence systems could still provide analysis and strategic proposals regarding nuclear events, but the final decision would remain with a human.


Going forward, Lieu says a broader approach to AI regulation is needed because Congress “doesn’t have the bandwidth or capacity to regulate AI on a per-application basis.” He is open to a set of AI risk standards agreed upon by federal agencies, or perhaps a separate agency dedicated to generative and future advanced AI. On Thursday, the Biden administration announced plans to offer $140 million in funding for new research centers dedicated to monitoring and regulating AI development.

Asked if he fears that society will face a new “AI arms race,” Lieu acknowledges it is “certainly possible,” but points to existing nuclear treaties. “Yes, there is a nuclear arms race, but it is not [currently] an all-out arms race. So there may not be an all-out arms race using AI,” Lieu says.

“Countries are looking at this and hopefully they will come together to say, ‘Here are some things we will not allow AI to do.'”



Copyright © 2023 Millennial One Media.