
Lidar and AI are the future of iPhones


As the iPhone turns fifteen this year, experts say technologies like lidar and AI are its future. At the height of the pandemic, Lucy Edwards, a blind British journalist and presenter, struggled to maintain social distance in public.

So she turned to the iPhone’s People Detection feature, which uses the lidar sensor in the iPhone 12 Pro and 13 Pro to tell her whether other people are nearby. Asked how she felt about this newfound control over her life, Edwards said, “I’m going to have to get used to it. However, I’m really pleased that I can be in control again.”

Lidar, or light detection and ranging, is one example of how far iPhone technology has come in the past 15 years. When the original iPhone went on sale on June 29, 2007, a 3.5-inch screen and a 2-megapixel camera were all it had.

Apple’s newest iPhones have three rear cameras that can shoot video, sensors that help people like Edwards find their way around, and powerful processors with billions of transistors.

From the digital assistant Siri to mobile payments to wireless charging, the iPhone has in many ways been a driving force in the growth of our mobile lives.

In the future, however, the ecosystem surrounding the iPhone may become its most significant component. That is the conclusion of analysts who have studied Apple’s approach and broader trends in the mobile sector.


Higher-quality cameras and larger screens are expected in the near future. Over the next decade, the iPhone could become a central hub for smart glasses and other wearable tech.

Apple Watches, AirPods and CarPlay-enabled automobiles may be just the beginning. The iPhone’s key components, including its display and charging technology, are also thought to be in line for major upgrades.

Runar Bjørhovde, an analyst with market research firm Canalys, said, “The next quest for the smartphone is to figure out what it will link to next. I think the smartphone is getting close to the edge of what it can do as a stand-alone device, because it hasn’t reached its full potential yet.”

Everything Is Controlled by Your iPhone

A lot of people are wondering what comes after the smartphone, and the overwhelming consensus appears to be smart glasses.

Bloomberg predicts that Apple might release a mixed-reality headset supporting augmented and virtual reality this year or next. According to the report, smart glasses powered by augmented reality might be available by the end of this year.

Will there be some connection to the iPhone? Everything is a possibility. The headset is intended to work on its own, but the apps and services it uses are likely to be linked to Apple’s iPhone.

The Apple Watch Comes to Mind

Although the Apple Watch doesn’t require an iPhone to work, the ability to synchronize with one is a big part of its appeal. Notifications on the Apple Watch are often linked to the iPhone’s accounts and applications.

The Apple Watch, AirPods and HomeKit-enabled gadgets are all expected to keep the phone in the spotlight.

Gene Munster, an Apple analyst and managing partner at Loup Ventures, said that “the phone is going to be the cornerstone” for the company.

Connecting personal technology devices is only one part of the equation. The iPhone is becoming ever more intertwined with our lives, and an increasingly capable replacement for our wallets. Apple has made great strides in this area in recent months with new capabilities like digital IDs and Tap to Pay, which lets shops accept contactless payments without extra hardware.

The company also just released Apple Pay Later, which lets people pay for purchases in four equal installments over a six-week period. “There is a lot of momentum in Apple’s financial services and I think we’ll see additional developments there,” said Nick Maynard, head of research for Juniper Research.
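The arithmetic behind such a pay-in-four plan can be sketched in a few lines. This is a hypothetical illustration, not Apple’s actual implementation: it splits a purchase into four equal installments (due roughly every two weeks across the six-week window), putting any leftover cents on the first payment so the parts always sum to the total.

```python
def split_into_installments(total_cents: int, parts: int = 4) -> list[int]:
    """Split a purchase into equal installments.

    Works in integer cents to avoid floating-point rounding; any
    remainder cents are added to the earliest payments.
    """
    base = total_cents // parts
    remainder = total_cents % parts
    return [base + (1 if i < remainder else 0) for i in range(parts)]

print(split_into_installments(10000))  # $100.00 -> [2500, 2500, 2500, 2500]
print(split_into_installments(9999))   # $99.99  -> [2500, 2500, 2500, 2499]
```

Working in cents rather than dollars is the standard way to keep the four payments from drifting a cent away from the purchase total.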

Stronger Lidar and AI for Better Spatial Awareness

Accurately predicting Apple’s long-term plans for the iPhone is far more difficult than making educated guesses about the individual changes that may be on the way. However, Apple’s current iPhones have sown the seeds for some speculation.

Lidar Sensors

Lidar is expected to remain a key component of the company’s augmented reality efforts. Apple introduced lidar in the iPhone 12 Pro to improve AR app performance, enable new camera tricks and support accessibility features like the aforementioned People Detection.

A lidar sensor determines distance by measuring how long it takes light to bounce off an object and return. But Munster thinks the lidar sensors in today’s iPhones may not be advanced enough to help Apple achieve its goals for augmented reality.
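The time-of-flight principle described above reduces to one formula: distance equals the speed of light times the round-trip time, divided by two. A minimal sketch (illustrative only, not Apple’s sensor code):

```python
# Speed of light in a vacuum, metres per second.
SPEED_OF_LIGHT_M_S = 299_792_458

def distance_from_round_trip(seconds: float) -> float:
    """One-way distance in metres for a measured round-trip time.

    The pulse travels out and back, so the one-way distance is
    half of (speed of light * elapsed time).
    """
    return SPEED_OF_LIGHT_M_S * seconds / 2

# A pulse that returns after ~33.4 nanoseconds hit an object
# roughly 5 metres away.
print(round(distance_from_round_trip(33.4e-9), 2))  # -> 5.01
```

The nanosecond timescale in the example is why precise real-world mapping is hard: resolving centimetres requires timing light pulses to within tens of picoseconds.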


Munster, whose firm researches virtual and augmented reality, said, “Specifically, what has to happen is that the mapping of the real world needs to be more precise. AR isn’t going to happen unless that does.”

Even though lidar has enhanced the iPhone’s depth-sensing capabilities, the processor is still responsible for interpreting all of that data. To give the iPhone and other devices more context about people and their surroundings, Apple has turned to artificial intelligence (AI), one of Silicon Valley’s favorite buzzwords in recent years.

You can see this strategy in action on the Apple Watch once again. Apple’s wristwatch employs artificial intelligence and data obtained from its sensors to perform functions such as tracking your sleep and recognizing when you’re washing your hands, among others.

Counterpoint Research senior analyst Hanish Bhatia offered a hypothetical illustration of how AI advancements could appear in future iPhones. In his example, Apple’s smartphones could eventually track a user’s activity and determine whether they are the device’s principal user or a member of their family.

Lidar and Ultra Wideband

“Are you using your phone in a certain way, or do you merely touch it with your nails, or something like that? What do you use your phone for?” Bhatia asked, making the point that each user exhibits a wide range of distinct behaviors.

Bhatia’s example does not mean Apple has anything like this in the works. However, with AI advancements and technologies like lidar and ultra-wideband giving the iPhone additional spatial awareness, it is possible to picture such a future.

Displays and charging systems may also undergo significant revisions in the near future. Whether Apple will build a foldable iPhone is one of the most pressing questions about its future smartphone plans.

Samsung, Apple’s biggest rival in the smartphone market, has already released several models with foldable screens, and Google is reported to have a foldable Pixel in the works; Motorola, Huawei and Microsoft have foldables of their own. According to the International Data Corporation, foldable smartphone shipments in 2021 grew 264.3% over 2020.


On the other hand, experts like Munster and Maynard are skeptical that Apple will follow suit. Although Apple has filed patents for flexible-display mobile devices, such filings aren’t necessarily indicative of Apple’s plans. Sales of foldable phones have grown, but they are still dwarfed by shipments of conventional smartphones. Finally, there’s the question of whether foldable phones actually improve the user experience.
