
Technology

Ford is lowering the price of BlueCruise’s hands-free driving feature


In response to customer and dealer feedback, Ford is lowering both the monthly and annual cost of its hands-free driver assistance feature, BlueCruise, for new and existing owners, TechCrunch reports.

The automaker announced on Tuesday that it will now charge $49.99 a month or $495 a year for BlueCruise, which allows drivers to take their hands off the wheel on pre-designated highways across the United States. That is down from the previous price of $75 per month or $800 per year.

Ford is also now offering a “one-time purchase” option for BlueCruise. Buyers can purchase BlueCruise for $2,495 when ordering a new vehicle, and Ford guarantees it will last for at least seven years. The company says owners won’t have to pay another cent after seven years “if the service is available.” Owners cannot transfer their BlueCruise subscription to another vehicle.
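For readers weighing the three options, a quick back-of-the-envelope comparison using only the prices above (our arithmetic, not Ford’s) shows where each plan breaks even:

```python
monthly = 49.99      # new monthly price, USD
annual = 495.00      # new annual price, USD
one_time = 2495.00   # one-time purchase, guaranteed for at least seven years

# Twelve monthly payments cost more than one annual plan.
print(f"A year of monthly payments: ${12 * monthly:.2f}")   # $599.88 vs. $495.00

# The one-time purchase matches the annual plan after about five years...
print(f"Annual plans equal to one-time price: {one_time / annual:.1f} years")  # ~5.0

# ...and clearly wins over the guaranteed seven-year window.
print(f"Seven years of annual plans: ${7 * annual:.2f}")     # $3,465.00 vs. $2,495.00
```

In other words, the annual plan undercuts a year of monthly payments, and the one-time purchase pays for itself in just over five years of annual renewals.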

The price drop comes as BlueCruise is under federal investigation following two fatal crashes that occurred earlier this year while the feature was active. The driver involved in one of those crashes was recently charged with manslaughter while under the influence of alcohol.

Announced in 2021, BlueCruise uses a camera-based driver monitoring system to check whether drivers are watching the road while the system is active. The company would not disclose what percentage of owners activate the feature. Ford’s price change to BlueCruise also comes a day after the company announced it would offer a free home charger and installation coverage to boost adoption of its electric vehicles.

This article was originally published on techcrunch.com

Technology

Anduril is accelerating the launch of defense payloads by purchasing ready-made Apex satellite buses

Anduril’s sensor network is expanding even further, to the “top level”: orbit.

The company, best known for its AI-powered defense products in the air, on land and at sea, is partnering with satellite bus startup Apex Space to rapidly deploy payloads into orbit for the US Department of Defense.

This is a rare case of an emerging defense contractor deciding to partner with a supplier rather than build the product itself or simply acquire the supplier outright. But the partnership makes sense: Anduril attributes much of its success to its approach to product design and development, which emphasizes rapidly building large numbers of products from off-the-shelf components to reduce costs. Apex does something similar with satellite buses, the part of the spacecraft that carries the payload; in the past, buses were custom-engineered, with long lead times and high prices.

“We’re really focused on recreating the same things we’ve done in other areas, in the space domain,” Gokul Subramanian, Anduril’s vice president of space and software, said at a press conference. “If you think about what Anduril has done successfully at sea, in the air and on the ground, there is a shift from the low-volume, high-cost systems that have traditionally been used to high-volume, low-cost systems. We have the same belief in space – that to be successful in space, we need to move to high-volume, low-cost production.”

Ian Cinnamon, co-founder and CEO of Apex Space, said the satellite bus is the “biggest bottleneck” in the space ecosystem, preventing America from putting more mass into orbit. The company’s goal is to deliver satellite buses to customers in weeks, not years, with more transparent pricing and a standardized product.

An Anduril-built payload flew in March on the first-ever Apex mission. Subramanian called it a “mission data processor”: it enables on-orbit processing of images captured by the satellite. The payload runs Lattice, the command-and-control software implemented across all Anduril products. In short, Anduril was able to demonstrate that it could point a spacecraft at a particular location, take an image of what the spacecraft saw, process that image, and transmit the data to Earth – all completely autonomously.

“It was the first experiment that gave us confidence in our vision for space, our collaboration with Ian and the bus platform they built,” he said.
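As a rough illustration of the point-capture-process-downlink loop described above, here is a minimal sketch in Python. Every name in it is hypothetical; nothing below comes from Anduril’s or Lattice’s actual APIs.

```python
# Hypothetical sketch of the autonomous tasking cycle described above:
# point the spacecraft, capture an image, process it on orbit, downlink
# only the result. All names are invented for illustration.
from dataclasses import dataclass


@dataclass
class Target:
    lat: float
    lon: float


def run_tasking_cycle(spacecraft, processor, downlink, target: Target) -> None:
    """One fully autonomous imaging cycle, end to end."""
    spacecraft.point_at(target)              # slew the bus toward the target location
    raw_image = spacecraft.capture()         # take the picture on orbit
    product = processor.analyze(raw_image)   # process the image on the satellite itself
    downlink.transmit(product)               # send only the processed data to Earth
```

Processing on orbit, rather than downlinking raw imagery, is what keeps the loop autonomous and the bandwidth requirements low.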

Anduril has already purchased a dedicated satellite bus from Apex, which will launch next year. Anduril will operate the system, which will carry payloads built in-house and by others. This will be the model going forward, the pair of executives explained: Apex will provide the buses, and Anduril will “mission the system,” Subramanian said.

Subramanian declined to comment on the specific opportunities the company hopes to pursue through the new partnership, but it leaves Anduril well positioned to take on a prime contractor role on some coveted contracts. For example, the Space Development Agency’s Proliferated Warfighter Space Architecture program is deploying hundreds of satellites to upgrade the Space Force’s aging missile tracking and defense architecture. The SDA spends huge sums on these satellites; so far, contracts to build satellites under the program have been awarded to Sierra Space, Rocket Lab and SpaceX, among others. Anduril no doubt hopes to join the club.

This is not Anduril’s first foray into space: in July 2023, the company won a $10.5 million contract from Space Systems Command to incorporate Lattice into Space Surveillance Network (SSN) sensors, which are used for missile early warning. Last week, the company was also awarded a $25.3 million contract from the Space Force to provide additional SSN upgrades.

This is the first of many partnerships Anduril intends to announce, including with other bus providers, Subramanian added.

This article was originally published on techcrunch.com

Technology

A Black MIT graduate has created a dance nonprofit for diversity in STEM


The nonprofit uses dance to make STEM education more accessible and exciting for diverse students, especially young Black girls.


A Black MIT graduate is using her love of dance to encourage diverse girls to explore science, technology, engineering and math (STEM).

Growing up on Long Island, New York, Yamileé Toussaint developed wide-ranging interests beyond STEM. Though her parents studied medicine and engineering, Toussaint’s family encouraged her to dance while pursuing her education. While at MIT, the 2024 CNN Hero even led a dance group.

“It has always been a source of community, perseverance and learning determination,” she told CNN.

However, Toussaint quickly noticed that few Black women like her pursued this path. She saw only two, including herself, studying mechanical engineering at her university. The feeling of isolation never left Toussaint, and it led her to initiate change in her community back in New York.

“What struck me most was that I didn’t feel so special, that I should be one of two people,” she explained. “I felt like it should be different and it could be different… I just started thinking about a world where the benefits of dance can lead to the outcomes we are looking for in STEM.”

After becoming a teacher in East Brooklyn in 2008, Toussaint founded STEM From Dance four years later. The nonprofit uses dance to make STEM education more accessible and exciting for diverse students, especially young Black girls.

The organization runs school-year and summer programs that reach dancers who are less interested in math and science. The lessons work cohesively, with participants building STEM projects as they choreograph their moves. By learning how to write code that drives the LED strips illuminating the dance floor, the girls see how science can enhance their performances.
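To make that concrete, here is a minimal sketch of the kind of LED-strip program such a lesson might involve. The hardware and library are assumptions on our part (a NeoPixel-style strip driven by CircuitPython); the article does not specify the setup.

```python
# A beat-synced LED strip: the whole strip flashes a new color on each
# beat of the choreography. Assumes a NeoPixel strip on pin D6 and the
# CircuitPython neopixel library; the article names no specific hardware.
import time
import board
import neopixel

NUM_PIXELS = 30
BEATS_PER_MINUTE = 120  # tempo of the routine

strip = neopixel.NeoPixel(board.D6, NUM_PIXELS, brightness=0.4)
seconds_per_beat = 60 / BEATS_PER_MINUTE
colors = [(255, 0, 0), (0, 0, 255), (0, 255, 0)]  # red, blue, green

while True:
    for color in colors:
        strip.fill(color)        # light the strip on the beat
        time.sleep(seconds_per_beat / 2)
        strip.fill((0, 0, 0))    # go dark for the off-beat
        time.sleep(seconds_per_beat / 2)
```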

The program has expanded to nine US cities. Toussaint emphasized that the mission goes beyond filling representation gaps in STEM; the organization also hopes to remind young girls that they can do things that intimidate them.

“Through dance, we are able to create an atmosphere in which we feel comfortable,” Toussaint said. “With this space, we’re able to introduce something that seems a little intimidating… So when they’re faced with a difficult math problem, they’re reminded, ‘I can do hard things.’”

According to the Equal Employment Opportunity Commission, only 14.58% of women in STEM identified as Black or African American in 2019. While Black women remain underrepresented in the field, Toussaint and STEM From Dance hope to remove this barrier for the next generation.

“I believe that the solution to some of the world’s most pressing problems lies in having these girls in the room because they have a different set of life experiences,” Toussaint said. “They are creative, intellectual, curious, artistic and will bring a different set of ideas to the discussion, so we need to make sure those are included.”

Meanwhile, hopes for further expansion remain alive: fundraising on GoFundMe to support these dance enthusiasts and future STEM professionals continues.



This article was originally published on www.blackenterprise.com

Technology

Meta won’t say whether it’s training AI on photos taken with its smart glasses


The AI-powered Meta Ray-Bans have a discreet front-facing camera that takes photos not only when you ask, but also when AI features are triggered by specific keywords, such as “see.” That means the smart glasses collect a lot of photos, both intentionally and unintentionally taken. But the company will not commit to keeping these photos private.

We asked Meta whether it plans to train artificial intelligence models on photos from Meta Ray-Ban users, as it does with photos from public social media accounts. The company would not say.

“We don’t discuss this publicly,” Anuj Kumar, a senior director working on AI-powered wearables at Meta, said in a video interview with TechCrunch on Monday.

“We don’t normally share this externally,” said Meta spokeswoman Mimi Huggins, who also participated in the video call. When TechCrunch asked for clarification on whether Meta trains on these images, Huggins replied: “We’re not saying otherwise.”

This is especially concerning because, among other things, Ray-Ban Meta has a new artificial intelligence feature that captures many photos passively. Last week, TechCrunch reported that Meta plans to launch a new real-time video feature for Ray-Ban Meta. When activated by specific keywords, the smart glasses will stream a series of images (essentially live video) to a multimodal AI model, enabling it to answer questions about your surroundings in a low-latency, natural manner.

That’s a lot of photos, and they are photos a Ray-Ban Meta user may not be aware of taking. Say you ask your smart glasses to scan the contents of your closet to help you choose an outfit. The glasses effectively take dozens of photos of your room and everything in it, and upload them all to an AI model in the cloud.
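To make the data flow concrete, here is a rough sketch of what such a keyword-triggered capture-and-upload loop could look like. The endpoint, payload format and camera API are all invented for illustration; none of this reflects Meta’s actual implementation.

```python
# Hypothetical sketch of a keyword-triggered burst upload. The endpoint,
# payload shape and camera API are invented; this is not Meta's code.
import base64
import time

import requests  # third-party HTTP client

CLOUD_ENDPOINT = "https://example.com/multimodal-model"  # placeholder URL
TRIGGER_KEYWORDS = {"see", "look"}


def heard_trigger(transcript: str) -> bool:
    """Return True if the spoken request contains a capture keyword."""
    return any(word in transcript.lower() for word in TRIGGER_KEYWORDS)


def stream_frames(camera, question: str, num_frames: int = 24) -> str:
    """Send a burst of frames (essentially live video) plus the question."""
    frames = []
    for _ in range(num_frames):
        jpeg_bytes = camera.capture_jpeg()  # hypothetical camera API
        frames.append(base64.b64encode(jpeg_bytes).decode())
        time.sleep(1 / 8)  # roughly eight frames per second
    response = requests.post(CLOUD_ENDPOINT,
                             json={"question": question, "frames": frames})
    return response.json()["answer"]
```

Even this toy version uploads dozens of frames per request.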

What happens to those photos afterward? Meta won’t say.

Wearing Ray-Ban Meta glasses also means wearing a camera on your face. As we learned with Google Glass, that’s not something other people universally welcome, to say the least. So you might think it would be obvious for a company doing this to say: “Hey! All your face-camera photos and videos will be completely private, and siloed to your face camera.”

But that is not what Meta is doing here.

Meta has already declared that it is training its AI models on every American’s public posts on Instagram and Facebook. The company has decided that this is “publicly available data,” and we may simply have to accept that. It and other tech companies have adopted a very expansive definition of what is publicly available for them to train AI on, and what is not.

But the world you look at through smart glasses is definitely not “publicly available.” While we can’t say for sure that Meta is training its AI models on Ray-Ban Meta camera footage, the company simply won’t say for certain that it isn’t.

Other AI model providers have more transparent policies on training with user data. Anthropic says it never trains on a customer’s input to, or output from, its AI models. OpenAI likewise says it never trains on user input or output submitted via its API.

We have reached out to Meta for further clarification and will update this story if we hear back.

This article was originally published on techcrunch.com