Search results for: “Apple Vision”

  • Apple Changes How You Can Buy AppleCare+: Now favors subscriptions

    Apple has recently updated its AppleCare+ program, shifting the way customers can purchase protection for their devices. According to reports from tech journalist Mark Gurman, Apple is moving away from one-time payment options for AppleCare+ when you buy it in stores or directly from your device.

    Now, if you want AppleCare+ for your iPhone or other Apple products, you’ll mostly have to opt for a subscription plan that charges you either monthly or yearly. For instance, for an iPhone 16 Pro, a monthly subscription would cost you $13.99, while the previous option to pay $269 for two years upfront is no longer available in physical stores or through the device’s settings.

    Apple’s customer service representatives are now telling customers that this switch to subscriptions helps reduce the initial cost of protection and makes sure there’s no break in coverage. This change also pushes customers towards the more comprehensive Theft and Loss plans, where you can replace a lost device for a fee.

    From now on, the primary way to get AppleCare+ is through these subscription models. However, there’s a small exception: if you’re buying your product online from the Apple Store, you can still choose to pay for AppleCare+ in one go during the checkout process.

    For those who choose annual billing, there’s a modest saving compared with paying month to month. For example, AppleCare+ for the Vision Pro can be $24.99 per month or $249 if you pay for the whole year.
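    To put those numbers in perspective, here’s a quick back-of-the-envelope comparison in Swift. The prices are the ones cited above; the arithmetic is just an illustration of what each payment schedule adds up to.

        // AppleCare+ prices as reported above; the math shows what each
        // payment schedule adds up to over the same period.
        let iPhoneMonthly = 13.99          // iPhone 16 Pro, per month
        let iPhoneTwoYearUpfront = 269.0   // former two-year one-time price

        let visionProMonthly = 24.99       // Vision Pro, per month
        let visionProAnnual = 249.0        // Vision Pro, per year

        // Paying monthly for two years on the iPhone plan:
        let iPhoneMonthlyTotal = iPhoneMonthly * 24                      // ≈ $335.76
        let iPhoneExtraCost = iPhoneMonthlyTotal - iPhoneTwoYearUpfront  // ≈ $66.76 more than upfront

        // Paying monthly for one year on the Vision Pro plan:
        let visionProMonthlyTotal = visionProMonthly * 12                // ≈ $299.88
        let visionProSavings = visionProMonthlyTotal - visionProAnnual   // ≈ $50.88 saved by paying annually

        print("iPhone 16 Pro: $\(iPhoneMonthlyTotal) paid monthly vs $\(iPhoneTwoYearUpfront) upfront")
        print("Vision Pro: annual billing saves about $\(visionProSavings) per year")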

    So, if you prefer not to deal with monthly or yearly payments, your only option is to purchase your device and AppleCare+ together online. Otherwise, in physical Apple Stores or via your device’s settings, subscriptions are the way to go.

  • Apple stops making AR Glasses for Mac

    Apple has decided to stop working on augmented reality (AR) glasses that were meant to work with the Mac, according to Bloomberg’s Mark Gurman. The idea was to make glasses that look normal but could show AR images.

    Apple wanted these AR glasses to be powered by the Mac because the glasses themselves couldn’t hold the big chip needed for AR without getting too hot or heavy. The goal was to have the glasses perform like an iPhone but use much less power. But, they found out that connecting the glasses to an iPhone wasn’t practical due to battery life issues, so they tried using the Mac instead. However, Apple’s leaders didn’t think this was a good long-term plan, so they ended the project.

    These AR glasses were lighter than Apple’s Vision Pro headset, didn’t need a head strap, and didn’t show the wearer’s eyes on the front. They also had lenses that could change color depending on what the user was doing, like signaling to others whether they were busy.

    Apple has been talking about AR glasses for nearly ten years, but the technology isn’t there yet for the kind of glasses they want. Back in 2023, Gurman mentioned that the AR glasses had become something of a running joke among the team, kept alive mainly to satisfy CEO Tim Cook. In 2017, Cook admitted that the tech for good-quality AR glasses didn’t exist, and it seems that’s still true.

    Even though they’ve stopped the glasses project, Apple is still working on new versions of the Vision Pro, hoping to return to the AR glasses idea when the tech catches up. The glasses were supposed to use tiny projectors to show images and videos to the wearer, and Apple continues to develop special microLED screens that could be used in future AR glasses.

    Meanwhile, Apple’s competitor, Meta, is making its own AR glasses called “Orion,” which are still in the early stages and very expensive to produce. They’re planning to launch them by 2027, the same year Apple had originally aimed for its now-canceled glasses.

  • Exciting updates in iOS 18.4: Seven new Apple Intelligence features

    Apple Intelligence has been making waves since its debut in iOS 18.1, and with the upcoming iOS 18.4, there’s even more to look forward to. Here are seven key features and changes that are on the horizon.

    Siri’s New Powers: Seeing What You See

    Imagine Siri understanding what’s on your screen just like the Vision Pro does. With iOS 18.4, Siri will gain this ability, making interactions much smoother. For instance, if you see a new address in a message, you could just tell Siri to add it to your contacts, and it would happen without any extra steps.

    Siri’s Expanded App Abilities

    With the new update, Siri will be able to do lots more without needing to open apps. It can perform actions like finding and editing a photo, then moving it to a specific folder in the Files app, all through voice commands.
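    As a rough idea of the plumbing behind this, third-party apps expose actions to Siri through Apple’s App Intents framework. The Swift sketch below is a hypothetical intent (the name, parameters, and behavior are invented for illustration, not Apple’s or any app’s actual code) that a photo or files app might register so Siri can move an item by voice without the app being opened.

        import AppIntents

        // Hypothetical intent an app could register so Siri can act on its
        // content without opening the app. Names and behavior are illustrative only.
        struct MovePhotoIntent: AppIntent {
            static var title: LocalizedStringResource = "Move Photo to Folder"

            @Parameter(title: "Photo Name")
            var photoName: String

            @Parameter(title: "Destination Folder")
            var folder: String

            func perform() async throws -> some IntentResult & ProvidesDialog {
                // A real app would locate the photo and move it here; this sketch
                // only returns the confirmation Siri would speak back.
                return .result(dialog: "Moved \(photoName) to \(folder).")
            }
        }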

    Siri Knows You Better

    Siri will now have a deeper understanding of you, similar to a personal assistant. You could ask Siri to find a recipe sent by a friend, and it would search through your emails, messages, and notes to find it. It can also retrieve personal details like your passport number or check your calendar.

    Smart Notification Prioritization

    Apple Intelligence will make your notifications smarter by highlighting the most urgent ones at the top of your list. This means you’ll catch the important stuff without sifting through less relevant alerts.
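    Conceptually, this kind of prioritization comes down to scoring each notification and surfacing the highest-scoring ones first. The Swift sketch below is only a toy illustration of that idea with an invented urgency score; it is not how Apple Intelligence actually ranks notifications.

        // Toy illustration of priority-based notification ordering. The scores
        // are invented; Apple Intelligence uses its own on-device models.
        struct IncomingNotification {
            let appName: String
            let message: String
            let urgency: Double   // hypothetical score in 0...1, higher = more urgent
        }

        let incoming = [
            IncomingNotification(appName: "Mail", message: "Weekly newsletter", urgency: 0.2),
            IncomingNotification(appName: "Messages", message: "Your flight boards in 30 minutes", urgency: 0.9),
            IncomingNotification(appName: "News", message: "Daily digest", urgency: 0.1),
        ]

        // The most urgent notifications float to the top of the list.
        for note in incoming.sorted(by: { $0.urgency > $1.urgency }) {
            print("\(note.appName): \(note.message)")
        }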

    Image Playground Gets Sketchy

    The Image Playground app will finally introduce the ‘Sketch’ style option that was promised but missing in earlier versions. This adds another creative way to produce images with AI.

    Apple Intelligence in Europe

    Good news for European users: Apple Intelligence features will expand to iPhones and iPads in the EU, starting with iOS 18.4 in April.

    More Languages for AI

    Apple is set to support more languages for its AI features, including Chinese, various forms of English, French, German, and several others, making Apple Intelligence more accessible globally.

    Once iOS 18.4 rolls out, we’ll see all the promised Apple Intelligence features from last year’s WWDC become a reality. With these updates, Apple continues to push the envelope on what AI can do for you, setting the stage for even more advancements in iOS 19.

  • When will we see Apple’s new budget iPhone?

    Apple introduced its high-end iPhone 16 series last fall, bringing lots of new technology, including Apple Intelligence. But soon, there’s going to be a new, cheaper iPhone with similar cool features. Here’s when you can expect the iPhone SE 4 to come out.

    When Will the New iPhone SE Come Out?

    Rumors suggest that Apple is planning to release the iPhone SE 4 in early 2025. Specifically, March 2025 seems to be the most likely month.

    Apple doesn’t usually launch big products in January or February, except for the Vision Pro last year. However, they often have new product announcements in March or April. Given what we know about how they’re making the iPhone SE 4, March looks like the best guess.

    Here’s when past iPhone SE models were released:

    • iPhone SE 3: March 18, 2022
    • iPhone SE 2: April 24, 2020
    • iPhone SE: March 31, 2016

    The only time Apple released an SE in April was during the global health crisis, hinting that March might be more typical for these launches.

    What’s New with the iPhone SE 4?

    The upcoming iPhone SE 4 is set to be a major step up from the current model. Here’s what you might see:

    • A screen that goes from edge to edge, with Face ID instead of a Home button, and a notch at the top
    • Powered by the same A18 chip as the iPhone 16
    • 8GB of memory
    • Support for Apple Intelligence
    • A 48MP camera matching the iPhone 16’s quality
    • A USB-C port for charging
    • Apple’s first self-made 5G chip

    While this new model won’t have every fancy feature of the pricier iPhone 16, it’s expected to offer great value. The current iPhone SE starts at $429; the new one might start a bit higher, perhaps around $499, and with more storage as standard.

    The iPhone SE 4 is shaping up to be an excellent choice for anyone looking for a lot of features without spending a fortune.

  • Play over 2,000 games on Vision Pro with NVIDIA GeForce NOW

    The Vision Pro might not be known for its gaming capabilities yet, but a new update is set to shake things up. Thanks to NVIDIA GeForce NOW, Vision Pro users can now dive into over 2,000 games, making gaming a whole lot more exciting.

    Cloud Gaming Boosts visionOS Gaming

    NVIDIA announced earlier this month that their cloud gaming service, GeForce NOW, would soon be compatible with Apple’s Vision Pro. Now, with the recent update, this support is live, bringing a significant boost to the gaming scene on visionOS.

    High-Quality Gaming at Your Fingertips

    GeForce NOW allows Vision Pro users to enjoy games in stunning 4K resolution at 120 frames per second. It also supports ultrawide displays with very little delay, and you can use your game controller. There are now more than 2,000 games available, and over 100 of these can be played for free, even without a subscription.

    All of this runs through the Safari browser on visionOS, making it easy for users to jump into their games. It’s not clear whether the latest visionOS 2.3 update is required for the feature, but updating your device before you start playing is a good idea.

    What This Means for Vision Pro Gamers

    While this update doesn’t solve the shortage of games specifically designed for VR on the Vision Pro, it does open up a vast library of games to explore right on your headset. With 2,000 new games accessible through a big Safari window, there’s plenty to keep gamers busy. Here’s hoping this is just the beginning of more gaming enhancements for Vision Pro this year.

    For more details on how to set up GeForce NOW on your Vision Pro, check out NVIDIA’s support page to get the best gaming experience.

  • Apple’s new plan for easy-to-wear smart glasses

    Apple is working on a new version of its visionOS software, which currently powers the Apple Vision Pro, to make it work with smart glasses. This move is part of their plan to offer more popular augmented reality (AR) products that are less bulky than their current headset.

    Apple’s Vision Pro Challenges

    The Apple Vision Pro, which costs $3,500, hasn’t been as successful as hoped. Many people have found it too heavy to wear for long periods, too pricey, and it also tends to get hot. Since it was released, interest has dropped, and sales haven’t met Apple’s goals.

    In his newsletter, tech journalist Mark Gurman shared that Apple’s Vision Products Group is now focusing on something lighter and more like the smart glasses Meta made with Ray-Ban. However, it might take at least three years before these glasses are ready, as there’s still a lot of research needed.

    User Studies and Software Development

    Apple is actively running user studies at its offices to see how people react to different features and interfaces for these glasses. The project, named “Atlas,” is managed by the Product Systems Quality team, part of Apple’s hardware division.

    The research is happening at a secretive site in Santa Clara, not far from Apple’s main office in Cupertino. Apple let go of some workers there last year, but those who remain are focused on AR technology. There’s also a facility for testing new screen technologies.

    Future Plans for Vision Pro

    Apple is not giving up on the Vision Pro entirely. They’re planning to make a cheaper version with simpler parts, hoping to sell it for about the price of their top-end iPhone, around $1,600. They wanted to launch this model by late 2024, but they’re still perfecting the design.

    Gaming Collaboration

    Additionally, Apple is teaming up with Sony to add support for PlayStation VR2 hand controllers to the Vision Pro, aiming to make it better for gaming. This partnership has been going on for a few months now.

    By focusing on these new, more accessible AR products, Apple hopes to expand its reach in the tech market and make AR part of everyday life.

  • Apple may bring AirPods with tiny cameras soon

    Apple is reportedly considering adding small cameras to future AirPods, as Bloomberg’s Mark Gurman mentioned. In his recent Power On newsletter, Gurman briefly touched on the idea of AirPods featuring tiny cameras, highlighting Apple’s growing focus on wearable technology. However, he didn’t elaborate on how these cameras might be used.

    Rather than capturing photos, these cameras are expected to function as infrared sensors. Apple supply chain expert Ming-Chi Kuo shared in June 2024 that Apple could begin mass-producing AirPods with infrared cameras by 2026. These sensors would work similarly to the Face ID receiver on iPhones.

    According to Kuo, these advanced AirPods are designed to enhance spatial audio, especially when paired with the Apple Vision Pro headset. For instance, if someone wearing these AirPods and the Vision Pro turns their head in a particular direction while watching a video, the audio in that direction could become more prominent, offering a richer, more immersive experience.
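    As a rough sketch of the underlying idea (invented for illustration, not Apple’s or Kuo’s described design), head-tracked spatial audio can be thought of as boosting whichever source the listener is facing. In Swift, using the simd library, a simple emphasis factor could be the alignment between the head’s facing direction and the direction to the source:

        import simd

        // Conceptual sketch of head-tracked audio emphasis: the more directly the
        // listener faces a sound source, the more that source is boosted.
        // Invented for illustration; not Apple's actual spatial audio pipeline.
        func emphasis(facing: SIMD3<Float>, toSource: SIMD3<Float>) -> Float {
            let alignment = simd_dot(simd_normalize(facing), simd_normalize(toSource)) // -1...1
            return max(0, alignment) // full boost facing the source, none when facing away
        }

        // Listener turns slightly right toward a video playing straight ahead.
        let facing = SIMD3<Float>(0.3, 0, -1)          // hypothetical head orientation
        let sourceDirection = SIMD3<Float>(0, 0, -1)   // direction to the on-screen audio source
        print(emphasis(facing: facing, toSource: sourceDirection))  // ≈ 0.96, so boost this source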

    Another exciting possibility is “in-air gesture control,” where the infrared cameras might allow users to interact with devices through hand movements, further integrating AirPods into Apple’s ecosystem of innovative controls.

    If the production schedule remains on track, these futuristic AirPods could hit the market around 2026 or 2027, potentially marking a big step forward in wearable tech.

  • Why Apple won’t buy TikTok: A Simpler Explanation

    Apple has the money to buy almost anything, but TikTok isn’t something it’s likely to acquire. This decision goes beyond just the price tag.

    Although Apple has been hugely successful in many areas, it has consistently struggled with launching social media platforms. While buying TikTok might seem like a shortcut, the challenges involved make it a risky move.

    TikTok isn’t officially on the market yet, but if it were to be sold, the buyer would need to be an American company to comply with U.S. regulations. Apple could technically buy TikTok—Bloomberg estimates its value at around $60 billion. However, purchasing it would mean starting a new division from scratch, which isn’t Apple’s strong suit.

    Apple has shown little interest or ability to thrive in the social media industry. Buying TikTok wouldn’t change the fact that the platform operates in a highly competitive space. Additionally, TikTok’s current operations already face controversies, such as limited search results on sensitive topics like abortion, seemingly to align with certain political views in the U.S. If Apple owned TikTok, it would be responsible for similar censorship decisions, potentially harming its reputation.

    Another major hurdle is the heavy moderation TikTok requires. Managing content on such a large platform is expensive and labor-intensive. While some companies, like Meta, have cut back on moderation to save money, Apple would face criticism if it followed suit. If it didn’t, the cost of moderation would still be a significant burden.

    Ultimately, Apple doesn’t need the complications that come with TikTok. The $60 billion price isn’t the issue—it’s the endless problems that would follow. Instead, Apple seems to be focusing on smaller, more manageable acquisitions, as seen with its $3 billion purchase of Beats in 2014, still its largest buy to date.

    In short, owning TikTok would bring more trouble than value to Apple.

  • Apple hires new leader to boost Siri and AI

    Apple is shaking things up inside its company to make Siri and its AI better, according to Bloomberg. They’ve brought in Kim Vorrath, who has been with Apple for 37 years, to lead the AI team under John Giannandrea, who is in charge of AI at Apple.

    Vorrath has a knack for managing big software projects and keeping everything on track. She’s known as Apple’s “bug fixer” and has been a big influence in the company. Before this new role, she was part of the team working on Apple’s AR/VR headset, the Vision Pro.

    This change comes right after lots of talk about how Siri didn’t do well when asked about Super Bowl scores. For a while now, Siri hasn’t been as good as other voice helpers, especially when compared to new AI chatbots.

    Apple has also been dealing with complaints about how its Apple Intelligence summarizes news, sometimes getting things wrong and confusing people. To tackle this, they’re planning to stop these summaries for news and entertainment apps in the next update, iOS 18.3, which should come out soon.

    Despite trying to make Siri better by adding ChatGPT from OpenAI, there are still issues. But Apple is working on it, with plans for more Siri improvements in the iOS 18.4 update and even bigger changes in iOS 19, which could make Siri more conversational, like ChatGPT and Google’s Gemini.

    Moving Vorrath to the AI team signals that Apple now sees AI as a higher priority than its Vision Pro work. She’s known for organizing engineering teams and improving how they work. In a note about the change, Giannandrea mentioned that they want to focus on making Siri work better and improving Apple’s own AI systems.

  • How Samsung Galaxy S25 borrowed from Apple’s playbook

    Fans of both Apple and Samsung often argue about who copied whom. While Apple has faced legal challenges over design, Samsung has been quite open about taking inspiration from Apple, especially with the launch of the Galaxy S25.

    Smart Features Borrowed

    We all know Apple has been slow with its AI developments. While Samsung’s phones are packed with smart AI tools, Apple’s AI features are just starting to roll out and are pretty basic. Still, Samsung couldn’t help but notice Apple’s AI offerings.

    Apple’s AI system can work with ChatGPT and is planning to integrate with Google Gemini. Samsung followed suit, making its AI system work with external chatbots, starting with Google Gemini instead of its own Bixby.

    When you use Samsung’s Gemini, you see a text box with a bright border, much like Siri. It handles both text and voice inputs, and when you highlight text, it shows options very similar to Apple’s text editing tools, allowing you to check spelling or format as a table.

    Samsung also introduced call recording, transcription, and summarization in its phone app, features already familiar to iPhone users with iOS 18. Galaxy S25 users can now search for photos by describing them, summarize web articles, and even turn photos into drawings, much like Apple’s Image Playground.

    For privacy, Samsung’s AI can work offline, similar to Apple’s approach of limiting cloud usage.

    User Interface Echoes

    During the Galaxy S25 reveal, Samsung introduced One UI 7. It features the Now Bar, which shows live updates like sports scores or timers, much like Apple’s Live Activities.

    Samsung’s camera updates mimic some iPhone features from months ago, including the ability to record in log format and tweak audio focus. They’ve also adopted a version of Apple’s Photographic Styles, giving users control over image filters and tones.

    Design Similarities

    The Galaxy S25 Ultra looks strikingly similar to the iPhone 16 Pro with its flat edges and rounded corners, moving away from Samsung’s previous curved designs. The top models now use titanium, while cheaper models stick with aluminum.

    Samsung jumped the gun on Apple’s rumored slim iPhone 17 Air with their Galaxy S25 Edge, choosing style over some features like a third camera. The protective cases for the Galaxy are almost identical to Apple’s transparent MagSafe cases.

    Moreover, Samsung’s upcoming VR headset, Project Moohan, seems inspired by Apple’s Vision Pro, even in its interface design.

    Innovation or Imitation?

    While some might see this as copying, Samsung does bring its own twist to these features. Their version of Photographic Styles, for example, allows for more creative control over image composition. However, in the tech world, where both iOS and Android offer similar functionalities, it’s clear that each company builds upon the other’s ideas to enhance user experience.

    Still, perhaps Samsung could aim for a bit more originality next time around.