Photoncycle aims to store energy cheaply using a clever hydrogen solution

The solar energy sector has been wrestling with interseasonal energy storage for years. The ability to harness surplus solar energy through the summer months for winter use remains an elusive goal, with existing solutions such as batteries falling short due to prohibitive costs and limited lifespans. Meanwhile, hydrogen, despite its clean-burning properties, has been sidelined due to inefficiency and high costs.

Photoncycle — a startup emerging from an accelerator at the Oslo Science Park in Oslo, Norway — is working on a solution. With a vision as clear as the summer sun, the startup claims its solid hydrogen technology can store energy more efficiently, and more economically, than any battery or liquid hydrogen solution available on the market.

A diagram showing Photoncycle’s vision of a complete system installed in a home. Image credits: Photoncycle

“Lithium-ion batteries use expensive metals. Our material is super cheap: storing 10,000 kilowatt hours costs about $1,500, so it’s almost nothing. In addition, our energy storage solution is 20 times denser than a lithium-ion battery and does not waste electricity,” explains founder and CEO Bjørn Brandtzaeg in an interview with TechCrunch. “This means we have a system where energy can be stored over time, allowing for seasonal storage. This is completely different from traditional batteries.”
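Taken at face value, the quoted figures imply a strikingly low cost per kilowatt-hour. A minimal sketch of the arithmetic; the lithium-ion pack price used for comparison is an illustrative assumption, not a figure from the article:

```python
# Cost-per-kWh comparison based on the figures quoted in the article.
PHOTONCYCLE_CAPACITY_KWH = 10_000   # quoted storage capacity
PHOTONCYCLE_COST_USD = 1_500        # quoted material cost

photoncycle_per_kwh = PHOTONCYCLE_COST_USD / PHOTONCYCLE_CAPACITY_KWH
print(f"Photoncycle material cost: ${photoncycle_per_kwh:.2f}/kWh")  # $0.15/kWh

# Assumed lithium-ion pack price for comparison (order-of-magnitude ballpark,
# not a number from the article).
LITHIUM_ION_PER_KWH = 150.0
print(f"Ratio vs assumed Li-ion pack: {LITHIUM_ION_PER_KWH / photoncycle_per_kwh:.0f}x")
```

On these assumptions, the material works out to roughly a thousandth of a typical battery pack's per-kWh price, which is the comparison Brandtzaeg is gesturing at.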

Photoncycle uses water and electricity to produce hydrogen. That in itself is not unusual if you follow fuel-cell vehicle technology. However, the company’s approach includes an innovative twist: a reversible, high-temperature fuel cell that can both produce hydrogen and generate electricity in the same device.

The core of Photoncycle’s innovation is hydrogen processing. The company produces hydrogen and then converts and stores it in solid form. It claims this storage method is not only safe, thanks to the non-flammable and non-explosive nature of the solid, but also highly efficient: it stores hydrogen at a density roughly 50% greater than liquid hydrogen, a significant advance in hydrogen storage. These innovations form the cornerstone of the Photoncycle system, and the company says the safe, dense storage of hydrogen they enable represents a major breakthrough in energy technology.
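The 50% density claim can be made concrete with a back-of-the-envelope check; the liquid-hydrogen density below is a standard reference value (about 70.8 kg/m³ at its boiling point), not a number from the article:

```python
# Rough volumetric density implied by the article's "roughly 50% greater
# than liquid hydrogen" claim. Liquid H2 density near its ~20 K boiling
# point is a standard reference value, not from the article.
LIQUID_H2_KG_PER_M3 = 70.8

claimed_solid_density = LIQUID_H2_KG_PER_M3 * 1.5
print(f"Implied solid-storage density: ~{claimed_solid_density:.0f} kg H2 per m^3")
```

That would put the solid storage at roughly 106 kg of hydrogen per cubic metre, if the claim holds as stated.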

Current clean energy solutions, such as rooftop solar, are limited by inconsistent supply due to the unpredictable nature of the weather. A robust, reusable energy storage solution could overcome these limitations, ensuring a stable energy supply when renewable sources encounter their inevitable intermittent periods.

Great in theory, but not without its own challenges.

“The Netherlands is the country in Europe with the highest density of rooftop solar energy. We are currently seeing huge growth due to high energy prices; everyone wants rooftop solar,” Brandtzaeg says. However, he adds that this can backfire for homeowners: “Last July in the Netherlands, in the middle of the day, it was 500 euros per megawatt hour to export electricity.”

Placing energy storage alongside an energy-producing house effectively allows homes to be disconnected from the grid. Photoncycle says it has built and tested the core components of its solution – the next step is to integrate them into a complete system. If successful, the company says, it could pose a serious threat to the Powerwall, Tesla’s lithium-ion battery solution.

David Gerez, CTO at Photoncycle, and Ole Laugerud, Photoncycle chemist, in Photoncycle’s purpose-built laboratory, which has been operating for nearly two years. Image credits: Photoncycle

“It’s a relatively complex system – that’s why so many PhDs from different fields are working on it. The reason why Elon Musk said hydrogen is stupid is because you lose a lot of energy when you convert electricity into hydrogen and vice versa,” says Brandtzaeg. He believes his company can turn this bug into a feature. “In residential buildings, where 70% of energy demand is for heating, it is possible to use excess heat to provide hot water. We will focus on markets where people currently use natural gas for heating, and then we will replace the gas boiler in the home, using existing water infrastructure.”

Brandtzaeg’s confidence in the operational framework of the concept is convincing. He pointed to a small model of their operating facility in the lab, scaled down to the size of a car battery. Brandtzaeg believes scaling it up should be seamless and cites that as the main reason the team felt confident in pursuing the project.

When it comes to providing power, hydrogen takes a while to generate electricity, so for buffering the company relies on an intermediate, more conventional battery to balance the load. The company is certainly attracting investor attention: Photoncycle has just raised $5.3 million (€5 million) to build the first few energy storage devices in Denmark, which it has chosen as its test market.
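The buffering idea, a small conventional battery serving the load instantly while the slower fuel cell ramps up, can be sketched as a toy dispatch loop; every name and number here is hypothetical, invented purely for illustration:

```python
# Toy dispatch loop: the battery covers the shortfall while the fuel cell
# ramps toward the load. All parameters are hypothetical illustration values.

def dispatch(load_kw, steps, fuel_cell_ramp_kw=1.0, battery_kwh=5.0):
    """Return per-step (fuel_cell_kw, battery_kw) pairs for a constant load."""
    fc_output = 0.0
    battery = battery_kwh
    plan = []
    for _ in range(steps):
        fc_output = min(load_kw, fc_output + fuel_cell_ramp_kw)
        shortfall = load_kw - fc_output      # what the battery must cover
        battery -= shortfall * (1 / 60)      # assume 1-minute time steps
        plan.append((fc_output, shortfall))
    return plan, battery

plan, remaining = dispatch(load_kw=3.0, steps=5)
# Fuel cell ramps 1 kW per step; the battery covers the gap until step 3.
print(plan)  # [(1.0, 2.0), (2.0, 1.0), (3.0, 0.0), (3.0, 0.0), (3.0, 0.0)]
```

The battery only needs to bridge the ramp-up window, which is why a relatively small, conventional pack suffices alongside the hydrogen system.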

“Based on the interest, we could have raised 10 times more than we did. However, after this round, I am still the majority owner,” says Brandtzaeg. “I wanted to maintain control of the company for as long as possible and not raise more capital than necessary to bring this service to market.”

This article was originally published on techcrunch.com

Fei-Fei Li chooses Google Cloud, where she led artificial intelligence, as the primary provider of computing solutions for World Labs


Cloud service providers are chasing AI unicorns, and the latest is Fei-Fei Li’s World Labs. The startup just chose Google Cloud as its primary computing provider for training artificial intelligence models, a move that could be worth hundreds of millions of dollars. The company said, however, that Li’s tenure as Google Cloud’s chief artificial intelligence scientist was irrelevant to the choice.

At its Google Cloud Startup Summit on Tuesday, Google announced that World Labs will devote a large portion of its funds to renting GPU servers on the Google Cloud Platform, ultimately to train “spatially intelligent” artificial intelligence models.

A handful of well-funded startups building foundational AI models are in high demand in the world of cloud services. The largest deals include OpenAI, which exclusively trains and runs AI models on Microsoft Azure, and Anthropic, which uses AWS and Google Cloud. These companies often pay millions of dollars for computing services, and at some point they may need much more as they scale their artificial intelligence models. This makes them valuable customers for Google, Microsoft, and AWS to build relationships with from the start.

World Labs is indeed building unique, multimodal AI models with significant computational needs. The startup just raised $230 million at a valuation of over $1 billion, in a deal led by A16Z, to build its artificial intelligence models. Google Cloud’s general manager of startups and AI, James Lee, tells TechCrunch that World Labs’ AI models will one day be able to process, generate and interact with video and geospatial data. World Labs calls these AI models “spatial intelligence.”

Li has deep ties to Google Cloud, having led the company’s artificial intelligence efforts until 2018. However, Google denies that the deal is a result of this relationship, and rejects the idea that cloud services are just a commodity. Instead, Lee said the bigger factor is services, such as a high-performance toolkit for scaling AI workloads and a large supply of AI chips.

“Fei-Fei is obviously a friend of GCP,” Lee said in an interview. “GCP wasn’t the only option they were considering. But for all the reasons we talked about – our AI-optimized infrastructure and ability to meet their scalability needs – they ultimately came to us.”

Google Cloud offers AI startups a choice between its proprietary AI chips, tensor processing units (TPUs), and Nvidia GPUs, which Google buys and which are in more limited supply. Google Cloud is trying to persuade more startups to train AI models on TPUs, mainly to reduce its dependence on Nvidia. All cloud service providers today are constrained by the shortage of Nvidia GPUs, so many are building their own AI chips to meet demand. Google Cloud says some startups are training and running inference exclusively on TPUs, but GPUs remain the industry’s favorite AI training chips.

As part of this agreement, World Labs has chosen to train its artificial intelligence models on GPUs. Google Cloud did not say what prompted this decision.

“We had been working with Fei-Fei and her product team, and at this point in the product roadmap it made more sense for them to work with us on the GPU platform,” Lee said in an interview. “But that doesn’t necessarily mean it’s a permanent decision… Sometimes (startups) move to other platforms like TPU.”

Lee did not reveal how large World Labs’ GPU cluster is, but cloud providers often dedicate huge supercomputers to startups training artificial intelligence models. Google Cloud promised another startup training foundational AI models, Magic, a cluster with “tens of thousands of Blackwell GPUs,” each with more power than a high-end gaming PC.

These clusters are easier to promise than to deliver. Microsoft, a competitor to Google in cloud services, reportedly struggles to meet OpenAI’s enormous computational demands, forcing the startup to turn to other sources of computing power.

World Labs’ contract with Google Cloud is non-exclusive, which means the startup can still strike deals with other cloud service providers. Google Cloud, however, says it expects to continue handling most of the startup’s workloads.

This article was originally published on techcrunch.com

The undeniable connection between politics and technology


Written by April Walker

In the modern era, the intersection of politics and technology has become an undeniable union, shaping the way societies function and the course of political processes. Rapid advances in technology have not only transformed communication and the dissemination of information, but also redefined political engagement, campaigning and governance. Given the political climate we face today, combined with the rise of artificial intelligence and the countless communication platforms at our disposal, let’s explore the profound impact of technology on politics, focusing on the digitization of grassroots movements, the threats posed by hackers and deepfakes, the contest between social media and traditional media, and the future of post-election politics.

Grassroots has gone digital

Historically, grassroots movements have relied on face-to-face interactions, community meetings, and physical rallies to mobilize support and drive change. However, the advent of digital technology has revolutionized these movements, enabling them to reach wider audiences with unprecedented speed and efficiency. Social media platforms, email campaigns and online petitions have become powerful tools for local organizers. These digital tools enable the rapid dissemination of information, real-time updates, and the ability to mobilize supporters across geographic boundaries. For example, the 2024 presidential election saw unprecedented use of tools like Zoom to unite communities in support of United States Vice President and presidential candidate Kamala Harris and to raise multimillion-dollar donations for her campaign. From the Zoom calls that began with Black women to those supporting “white dudes” and everything in between, we have witnessed how, in a very short time, these communities not only raised an enormous sum of money but also helped amplify her message while consistently refuting the disinformation and lies spread by her opponent.

This is not the first time the power of social media in organizing protests and gathering international support has been demonstrated; consider the Arab Spring of 2010–2011. Platforms such as Twitter (now known as X) and Facebook played a key role in coordinating demonstrations and sharing real-time updates, which ultimately contributed to significant political change in the region. Similarly, contemporary movements such as Black Lives Matter have harnessed the power of digital technology to amplify their message, organize protests, and raise awareness on a global scale.

Hackers and Deepfakes

While technology has empowered political movements, it has also introduced new threats to the integrity of political processes. Hackers and deepfakes pose two significant challenges in this regard. Cyberattacks on political parties, government institutions and electoral infrastructure are becoming more common, threatening the security and integrity of elections. In 2016, the Russian hack of the Democratic National Committee (DNC) highlighted the vulnerability of political organizations to cyber threats. Our current electoral process faces an even more direct and immediate threat. Advances in artificial intelligence and the sheer proliferation of its use have created an increasingly sophisticated network of “bad actors,” especially from other countries, who are using the technology to try to influence the outcome of the US presidential election.

Deepfakes, manipulated videos or audio recordings that appear authentic, present another disturbing challenge. These sophisticated falsehoods can spread disinformation, discredit political opponents and manipulate public opinion. Just look at the recent AI-generated images of music superstar Taylor Swift that falsely portrayed her as supporting Republican presidential candidate Donald Trump when, in reality, she has publicly expressed her support for Kamala Harris. There is growing concern that deepfakes could undermine trust in political leaders and institutions. As technology continues to advance, the ability to detect and address these threats becomes crucial to maintaining the integrity of democratic processes.

Social media versus traditional media

The rise of social media has fundamentally changed the landscape of political communication. Traditional media such as newspapers, television and radio used to have a monopoly on the dissemination of information. However, social media platforms have democratized the flow of information, enabling individuals to share news, opinions and updates instantly. This change has both positive and negative implications for politics.

On the positive side, social media allows politicians to communicate directly with the public, promoting transparency and consistent engagement. Politicians can use platforms like Instagram, TikTok, LinkedIn, Facebook and others to share their views, respond to voters, mobilize support and, in many cases, reach specific demographics that are critical to closing gaps in areas requiring coordinated focus and attention. However, the unregulated nature of social media also enables the spread of disinformation, fake news and “echo chambers,” where individuals are only exposed to information that reinforces their existing beliefs. If you are curious whether this is true, take a look at the various messages you read on platforms supporting each of the political parties; it is amazing how starkly different these narratives can be.

With unwavering editorial standards and robust fact-checking processes, traditional media continues to play a key role in providing trusted information. However, it faces challenges in competing with the speed and global reach of social media. The coexistence of these two types of media creates a complex information ecosystem that demands critical thinking, intentional skepticism and media literacy from society.

What’s next after the elections?

As we approach the upcoming Presidential Election Day, and indeed beyond it, attention will shift to managing and implementing campaign promises. Technology continues to play a key role at this stage, with governments using digital tools to ensure effective administration, transparency and citizen engagement.

In my opinion, the post-election period and current policies will have to confront the challenges posed by disinformation and cyber threats. Governments and organizations must invest comprehensively in cybersecurity measures, digital skills programs and regulatory frameworks to protect the integrity of political processes. As technology evolves at an extremely rapid pace, the future of politics will likely see its continued integration, with an emphasis on balancing its benefits against the need to protect democratic values, institutions and, most importantly, public trust! That said, we cannot be afraid of technology; we must embrace it, because innovations like artificial intelligence and GenAI are creating competitive opportunities for our country that are unimaginably powerful and have the potential to positively change the course of humanity. During the Democratic National Convention, Vice President Kamala Harris noted in her acceptance speech: “I will make sure that we lead the world into the future on space and artificial intelligence, that America, not China, wins the competition for the 21st century, and that we strengthen, not abdicate, our global leadership.”


This article was originally published on www.blackenterprise.com

Alaska Airlines venture capital lab creates its first startup: Odysee


Odysee CEO Steve Casley sees dollar signs in data. More specifically, in artificial intelligence-based software that can analyze vast amounts of data to help commercial airlines profit from complex flight schedules.

That’s exactly what Odysee, the first startup born from the aviation-focused venture lab created by Alaska Airlines and UP.Labs, is doing. The two companies launched the venture lab last year to create startups designed to solve specific problems in air travel, such as guest experience, operational efficiency, aircraft maintenance, routing and revenue management. Odysee said it raised $5 million in a pre-seed round led by UP.Partners, a Los Angeles-based VC firm affiliated with UP.Labs. Alaska Star Ventures, which launched in October 2021, has invested $15 million in UP.Partners’ inaugural early-stage fund.

According to Casley, Alaska Airlines CEO Ben Minicucci flagged a scheduling problem early on. And no wonder. While there is software available to analyze flight data and plan flights, Casley says such tools lack the kind of real-time data, and, most importantly, the revenue forecasts, that Odysee produces.

“You need tools to make better decisions, because typically in airlines, schedule changes are made by experienced planners who do it intuitively,” Casley said in a recent interview. “I wouldn’t say it’s seat-of-the-pants, because a lot of the time they will be right, since they have seen both good and bad changes. But they never actually had the data to support their decisions.”

According to the company, the data-enabled software can run hundreds of simulations in seconds to quickly determine how schedule changes could impact revenue, profits and reliability.

“There are other optimizers, but to my knowledge none of these models or any optimization company offers revenue forecasts,” Casley said.

The machine learning model built by Odysee contains roughly 42 attributes, covering everything from departure time and day to traffic volumes on a given route and competitors’ schedules. In early simulations, the startup found it could save Alaska hundreds of thousands of dollars with just one schedule change.
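The described approach, scoring a candidate schedule change across many simulated demand scenarios to estimate its expected revenue impact, might look roughly like the following sketch; the attribute names, the toy revenue function and all numbers are invented for illustration, since Odysee's actual ~42-attribute model is not public:

```python
# Hypothetical sketch: estimate the expected revenue delta of a schedule
# change by averaging a toy model over many simulated demand scenarios.
import random

def simulate_revenue(change, rng):
    """Toy revenue delta (USD) for one simulated demand scenario."""
    demand_noise = rng.gauss(0, 50)  # random demand variation per scenario
    base = -200 * change["competitor_overlap"] + 30 * change["route_traffic"]
    peak_bonus = 500 if 7 <= change["departure_hour"] <= 9 else 0
    return base + peak_bonus + demand_noise

def expected_revenue_delta(change, n_sims=1000, seed=42):
    """Average the toy model over many simulated scenarios."""
    rng = random.Random(seed)
    return sum(simulate_revenue(change, rng) for _ in range(n_sims)) / n_sims

# A hypothetical candidate change: move a Friday departure to 8 a.m. on a
# busy route that overlaps with one competitor's schedule.
candidate = {"departure_hour": 8, "day_of_week": 4,
             "route_traffic": 12, "competitor_overlap": 1}
print(f"Expected revenue delta: ${expected_revenue_delta(candidate):,.0f}")
```

The real system presumably replaces the toy revenue function with a learned model over all ~42 attributes, but the simulate-and-average structure is what lets it score a change "in seconds" rather than rely on a planner's intuition.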

Odysee is currently conducting user acceptance testing with Alaska. Once that is complete, Alaska will begin a trial period of the software, which Casley said will start in late October.

That’s a short timeframe, considering UP.Labs and Alaska Airlines established the aviation lab only a year ago. A fast path to commercial products is one of the main benefits of UP.Labs. The firm, which launched in 2022, is structured as a venture lab with a new kind of financial investment vehicle. It partners with large corporations such as Porsche, Alaska Airlines, and most recently J.B. Hunt to launch startups with new business models aimed at solving the industry’s biggest problems. Each partnership will create six startups over three years.

Under the UP.Labs structure, these startups will not be created solely to serve a corporate partner – in this case, Alaska Airlines. Rather, they will operate independently as commercial enterprises from the outset, ultimately generating revenue from the sale of products or services across the industry.

This article was originally published on techcrunch.com