
Technology

Humanoid robots learn to fall well


The savvy marketers at Boston Dynamics delivered two major pieces of robotics news last week. The bigger of the two was, of course, the announcement of the electric Atlas. As I write this, the sub-40-second video is steadily closing in on five million views. The day before, the company tugged at the community’s heartstrings by announcing that the original hydraulic Atlas is being put out to pasture, ten years after its launch.

An accompanying video celebrated the older Atlas’ journey from a DARPA research project to an impressively agile bipedal “bot.” But about a minute in, the tone shifts. Ultimately, Farewell to Atlas is both a celebration and a bummer. It’s a welcome reminder that for every landing a robot sticks on camera, there are dozens of slips, falls and stumbles.

Image credits: Boston Dynamics


I have long advocated for this kind of transparency, and it’s something I would like to see more of in the world of robotics. Showing only the highlight reel undersells the effort that went into getting those shots; in many cases, we’re talking about years of trial and error that led to robots looking good on camera. By sharing only the successes, you set unrealistic expectations. Bipedal robots fall over. In this respect, at least, they’re the same as us. As Agility put it recently: “Everyone falls sometimes. How we rise up defines us.” I’d go a step further and add that learning to fall well is just as important.

Agility’s newly appointed CTO, Pras Velagapudi, recently told me that seeing robots fall on the job at this stage is actually a good thing. “When a robot is actually in the world doing real things, unexpected things will happen,” he notes. “You’re going to see some falls, but that’s part of learning to run for a really long time in real-world conditions. This is to be expected and is a sign that you are not staging anything.”

A quick review of Harvard’s guidelines for falling without injury reflects what we intuitively understand about falling as humans:

  1. Protect your head
  2. Use your weight to guide your fall
  3. Bend your knees
  4. Avoid taking other people down with you

As for robots, this excerpt from last year’s IEEE Spectrum interview is a great place to start.

“We’re not afraid of failure — we don’t treat robots as if they’re going to break down all the time,” Boston Dynamics chief technology officer Aaron Saunders said last year. “Our robot falls a lot, and one of the things we decided a long time ago is that we need to build robots that can fall without breaking. If you can go through the cycle of bringing the robot to failure, investigating the failure, and repairing it, you can make progress to the point where the robot does not fall. But if you build a machine, control system, or culture that never falls, you will never learn what you need to learn to keep the robot from falling. We celebrate falls, even those that break the robot.”


Image credits: Boston Dynamics

The topic of falling also came up when I spoke with Boston Dynamics CEO Robert Playter ahead of the electric Atlas launch. It’s worth noting that the short video starts with the robot in a prone position. The way the robot’s legs arc over is quite novel, allowing the system to get up from a completely flat position. At first glance, you might get the impression that the company is showing off, using this flashy movement simply to showcase its extremely durable, custom-built actuators.

“It will be a very practical capability,” Playter told me. “Robots are going to fall. You’d better be able to get up from a lying position.” He adds that the ability to get up from a prone position may also prove useful for charging.

Much of Boston Dynamics’ knowledge about falling comes from Spot. While the quadrupedal chassis is generally more stable (as evidenced by years of videos of people trying and failing to kick the robots over), Spot robots simply do a lot more work in real-world conditions.


Image credits: Agility Robotics

“Spot robots travel approximately 70,000 km per year around production facilities and conduct approximately 100,000 inspections per month,” Playter adds. “Eventually they fall. You have to be able to get up. Hopefully you’re driving your fall rate down – we are. I think we fall once every 100 to 200 km. The fall rate is really low, but it does happen.”
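
Taken at face value, those figures imply a rough back-of-the-envelope fall count for the fleet: 70,000 km per year divided by one fall every 100–200 km works out to roughly 350–700 falls per year across all deployed Spots.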

Playter adds that the company has a long history of being “rough” with its robots. “They fall and they have to survive. Fingers can’t be falling off.”

Watching the Atlas videos above, it’s hard not to project a bit of human empathy onto the “bot.” It really does seem to fall like a human, pulling its limbs in as close to its body as possible to protect them from further damage.


When Agility added arms to Digit in 2019, the company discussed the role they play in falling. “For us, arms are at once a tool for moving through the world – think getting up after a fall, swinging your arms for balance, or opening doors – while also being useful for manipulating or carrying objects,” co-founder Jonathan Hurst noted at the time.

I talked a bit about this with Agility at Modex earlier this year. The year before, a video of a Digit robot collapsing on the show floor had circulated on social media. “With a 99% success rate in approximately 20 hours of live demos, Digit still had a few falls at ProMat,” Agility noted at the time. “We have no evidence, but we suspect our sales team arranged it so they could talk about Digit’s quick-change limbs and durability.”


As with the Atlas video, the company told me that a fetal-like position is useful for protecting the robot’s arms and legs.

The company uses reinforcement learning to help fallen robots right themselves. Agility disabled Digit’s obstacle avoidance for the video above in order to force a fall. In the video, the robot uses its arms to break the fall as much as possible. It then uses what it has learned through reinforcement to return to a familiar position from which it can stand again, performing something like an automated push-up.
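
To make the idea a bit more concrete, here is a minimal, purely illustrative sketch of the kind of reward shaping a get-up policy might be trained against. The state fields, weights and numbers below are assumptions for illustration, not Agility’s actual system.

```python
# Purely illustrative sketch of reward shaping for a "get-up" policy.
# The state fields, weights, and thresholds are hypothetical; they are not
# Agility Robotics' actual reinforcement-learning setup.
import random
from dataclasses import dataclass


@dataclass
class RobotState:
    torso_height: float   # metres above the floor
    torso_upright: float  # 1.0 = fully upright, 0.0 = lying flat
    joint_effort: float   # normalised actuator effort for this timestep


def get_up_reward(state: RobotState, standing_height: float = 0.9) -> float:
    """Reward progress toward standing while discouraging wasted effort."""
    height_term = min(state.torso_height / standing_height, 1.0)
    upright_term = state.torso_upright
    effort_penalty = 0.1 * state.joint_effort
    return height_term + upright_term - effort_penalty


def score_trajectory(trajectory: list[RobotState]) -> float:
    """A policy is judged by the cumulative reward of the states it visits."""
    return sum(get_up_reward(s) for s in trajectory)


if __name__ == "__main__":
    random.seed(0)
    # Fake trajectory: the robot rises from prone toward standing over five steps.
    trajectory = [
        RobotState(torso_height=0.2 * i, torso_upright=0.25 * i,
                   joint_effort=random.uniform(0.2, 0.6))
        for i in range(5)
    ]
    print(f"trajectory score: {score_trajectory(trajectory):.2f}")
```

In a real training pipeline, a simulator would roll out many thousands of such trajectories and an RL algorithm would adjust the controller to maximise exactly this kind of score.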

One of the major selling points of humanoid robots is their ability to slot into existing workflows – these factories and warehouses are called “brownfield” sites, meaning they weren’t purpose-built with automation in mind. In many existing factory automation deployments, an error means the system effectively shuts down until a human intervenes.

“Rescuing a humanoid robot won’t be easy,” Playter says, noting that these systems are heavy and can be difficult to right manually. “How are you going to do that if it can’t get itself off the ground?”


If these systems are truly going to deliver uninterrupted automation, they’re going to have to fall and get back up quickly.

“Every time Digit goes down, we learn something new,” adds Velagapudi. “When it comes to two-legged robotics, falling is a great teacher.”


This article was originally published on techcrunch.com

Technology

The planned OpenAI data center in Abu Dhabi would be bigger than Monaco


Sam Altman, CEO of OpenAI

OpenAI is poised to help develop a staggering 5-gigawatt data center campus in Abu Dhabi, positioning the company as the anchor tenant in what could become one of the biggest AI infrastructure projects in the world, according to a new Bloomberg report.

The facility would reportedly span a massive 10 square miles and consume power equivalent to five nuclear reactors, dwarfing the existing AI infrastructure announced by OpenAI or its competitors. (OpenAI has not yet responded to TechCrunch’s request for comment, but for perspective, that footprint is bigger than Monaco.)

The UAE project, developed in partnership with Abu Dhabi-based conglomerate G42, is part of OpenAI’s ambitious Stargate project, a joint venture announced in January that could see massive data centers built around the globe to power the development of AI.


While the first Stargate campus in the United States – already underway in Abilene, Texas – is expected to reach 1.2 gigawatts, this Middle Eastern counterpart will be more than four times that size.

The project lands amid broader AI ties between the U.S. and the UAE that stretch back several years and have irked some legislators.

OpenAI’s dealings with the UAE date back to a 2023 partnership with G42 aimed at pursuing AI adoption in the Middle East. Speaking earlier in Abu Dhabi, OpenAI CEO Sam Altman himself praised the UAE, saying it “has been talking about AI since before it was cool.”

As with much of the AI world, these relationships are … complicated. Founded in 2018, G42 is chaired by Sheikh Tahnoun bin Zayed Al Nahyan, the UAE’s national security adviser and the younger brother of the country’s president. Its embrace by OpenAI raised concerns among American officials in late 2023, who feared that G42 could give the Chinese government access to advanced American technology.


Those fears centered on G42’s “active relationships” with blacklisted entities, including Huawei and the Beijing Genomics Institute, as well as with individuals tied to Chinese intelligence efforts.

Under pressure from American lawmakers, G42’s CEO told Bloomberg in early 2024 that the company had changed its strategy, saying: “All our China investments that were previously made have already been divested. Because of that, of course, we no longer need any physical presence in China.”

Shortly afterwards, Microsoft – OpenAI’s biggest shareholder, with its own broader interests in the region – announced a $1.5 billion investment in G42, and its president Brad Smith joined G42’s board.


This article was originally published on techcrunch.com

Technology

Redpoint raises $650 million three years after its last big early-stage fund


Redpoint Ventures, a San Francisco-based firm that has been around for about a quarter of a century, has raised $650 million for a new early-stage fund, according to a regulatory filing.

The latest Redpoint fund matches the size of its previous one, which was raised slightly less than three years ago. In a market where many venture firms are cutting back their fund targets, that consistency may indicate that limited partners are relatively satisfied with its results.

The firm’s early-stage strategy is managed by four managing partners: Alex Bard (pictured above), Satish Dharmaraj, Annie Kadavy and Erica Brescia, who joined the firm in 2021 after serving as GitHub’s chief operating officer for nearly three years.


The early-stage team’s recent notable investments include AI coding startup Poolside, which was founded by former Redpoint partner and GitHub CTO Jason Warner, a distributed SQL database developer, and procurement management platform Levelpath.

The multi-stage firm also runs a growth strategy led by partners Logan Bartlett, Jacob Effron, Elliot Geidt and Scott Raney. Last year, Redpoint raised its fifth growth fund at $740 million, a small increase over the $725 million fund it closed three years earlier.

Redpoint’s recent exits include Next Insurance, which was sold for $2.6 billion in March; travel media startup Tastemade, which was acquired by Wonder for $90 million; and HashiCorp, which IBM acquired for $6.4 billion.

Redpoint did not respond to a request for comment.



This article was originally published on techcrunch.com

Technology

Tensor9 helps vendors deploy software in any environment using digital twins


Enterprises want access to the latest software and artificial intelligence tools, but they can’t risk sending their sensitive data to outside software-as-a-service (SaaS) vendors. Tensor9 is trying to help software companies win more of those enterprise customers by helping them deploy their software directly into a customer’s tech stack.

Tensor9 converts a software vendor’s code into the format needed to deploy it in the customer’s technical environment. It then creates a digital twin of the deployed software – a miniaturized model of the deployed software’s infrastructure – so Tensor9’s customers can monitor how the software is behaving inside their customer’s environment. Tensor9 can help companies deploy into any setting, from the cloud down to a bare-metal server.
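
As a rough mental model – a hypothetical sketch, not Tensor9’s actual API – the digital twin idea boils down to an agent inside the customer’s environment reporting only coarse health metrics back to a vendor-side mirror, so raw customer data never has to leave:

```python
# Conceptual sketch only (hypothetical names, not Tensor9's real API): an agent
# running inside the customer's environment forwards coarse health metrics to a
# vendor-side "digital twin", so the vendor sees behaviour, never raw data.
import time
from dataclasses import dataclass, field


@dataclass
class HealthMetrics:
    service: str
    cpu_pct: float
    error_rate: float
    timestamp: float = field(default_factory=time.time)


class DigitalTwin:
    """Vendor-side mirror of the deployed software, built from metrics only."""

    def __init__(self) -> None:
        self.latest: dict[str, HealthMetrics] = {}

    def ingest(self, metrics: HealthMetrics) -> None:
        self.latest[metrics.service] = metrics

    def unhealthy_services(self, max_error_rate: float = 0.05) -> list[str]:
        return [name for name, m in self.latest.items()
                if m.error_rate > max_error_rate]


if __name__ == "__main__":
    twin = DigitalTwin()
    # In practice these would arrive over an outbound-only channel from the
    # customer's VPC or on-prem cluster; here they are hard-coded for illustration.
    twin.ingest(HealthMetrics("search-indexer", cpu_pct=42.0, error_rate=0.01))
    twin.ingest(HealthMetrics("query-api", cpu_pct=88.0, error_rate=0.12))
    print("needs attention:", twin.unhealthy_services())
```

The real product presumably tracks far richer infrastructure state, but the design constraint is the same: the vendor can observe and debug its software without ever touching the data it processes.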

Michael Ten-Pow, Tensor9’s co-founder and CEO, told TechCrunch that the company’s flexibility to deploy software into any environment, combined with its use of digital twin technology for remote monitoring, helps it stand out from other companies, such as Octopus Deploy or Nuon, that also help vendors deploy software into customers’ environments.


“You can’t just throw software over the wall – or you can, but then it’s very hard to know what’s happening, to be able to find problems, debug them, and fix them,” said Ten-Pow (pictured above, left). “They can see how it’s working, they can debug it, they can log in and understand what the problems are and fix them.”

He said the timing is right for Tensor9’s technology because of tailwinds from the rise of AI. Enterprises and financial institutions want to adopt AI, but they can’t risk sending their data to third parties.

“An enterprise search vendor can’t go to, say, JPMorgan and say, ‘Hey, I’d love access to all six petabytes of your data to build an intelligent search layer so your internal employees can talk to the company’s data’ – that’s just not going to work,” said Ten-Pow.

Ten-Pow, a former AWS engineer, said he took a “long, pretty winding path” to launching Tensor9. He came up with the idea for the company while working on another potential concept that failed. He had spent some time exploring whether he could build a solution to make it easier for software vendors to get SOC 2 certification, a security compliance framework, to help them unlock customers that required it of their vendors.


Although that idea didn’t pan out, he learned from those customer conversations that what companies really wanted was for the software to run in their own technical environment. But many software companies, especially startups, don’t have the resources to build a bespoke deployment for every enterprise customer.

That insight became the basis of Tensor9, which Ten-Pow launched in 2024. Later that year he brought on two of his former colleagues, Matthew Michie and Matthew Shanker, as co-founders.

The company found early traction with AI companies and has since started expanding into other industries, including enterprise search, enterprise databases and data management. It currently works with AI companies including 11x, Reell AI and Dyna AI.

Tensor9 bootstrapped for its first year and recently raised a $4 million round led by Wing VC, with participation from Level Up Ventures, Devang Sachdev of Model Ventures, NVAngels (an angel group of former Nvidia employees) and other angel investors. Ten-Pow said getting investors on board with the idea wasn’t too difficult, since the VCs they talked to had watched their own portfolio companies struggle with this exact problem. Tensor9 just had to convince investors that it was the right team for the job.


“We have a simple model, but there’s a lot of complexity under the covers that makes it happen – difficult technical challenges that we solved to make it work,” said Ten-Pow. “I think that was one of the things that helped us convince investors to invest in us.”

The company plans to use the funds for hiring and to build the next generation of its technology so that it can work with customers across a larger number of verticals.

“There was an evolution from on-prem to the cloud, and we think this idea of software living where it needs to and running where it needs to is the next step – a kind of synthesis of the earlier on-prem and cloud ideas,” Ten-Pow said.


This article was originally published on techcrunch.com
