Apple has officially launched Apple Intelligence features, calling this a “new era” for its devices. After months of beta testing, these features are available to everyone through the iOS 18.1 update. So, the big question is: How can you get Apple Intelligence on your iPhone?
This rollout comes at a key time, as Apple has been working to catch up in the AI space. Since OpenAI’s ChatGPT sparked an AI craze in late 2022, the Cupertino giant has been relatively quiet. Apple Intelligence represents Apple’s move to show it can hold its own in AI innovation. Let’s dive into everything you need to know about Apple Intelligence, including how to get started with these new features.
The free iOS 18.1 update, which brings Apple Intelligence to your device, started rolling out on Monday. It’s a gradual release, so if you don’t see it immediately, don’t worry! Once it’s ready, you’ll get a notification, or you can check yourself by going to Settings > General > Software Update. Make sure you’re set up with iOS 18.1!
While this update can install on the iPhone SE (2nd gen), iPhone XR, XS, XS Max, and any iPhone from the 11 to 16 series, Apple Intelligence features are only available on the iPhone 16 lineup and iPhone 15 Pro and Pro Max due to memory needs. If you’re running low on space, there are tips for freeing up storage on your iPhone, iPad, or Mac.
After you’ve installed iOS 18.1, just head to Settings, tap “Apple Intelligence & Siri,” and switch on Apple Intelligence. You’ll be added to a waitlist, and your iPhone will notify you as soon as Apple Intelligence is ready to use. From my testing, the whole setup only takes a few minutes.
That’s all there is to it! You’ve got Apple Intelligence up and running, ready to explore AI built specifically for Apple devices. These features are Apple’s first venture into generative AI, so go ahead and give them a try to see what you think.
Before we get into what it’s like to use Apple Intelligence, here’s a quick heads-up. A few of the features Apple introduced earlier this year, like creating emojis and images with AI and the iPhone’s ChatGPT integration, didn’t make it into iOS 18.1. The ones that did are still in beta, so they’re not fully refined yet. Let’s break down what Apple Intelligence has to offer so far.
Apple’s AI is there to help you with your writing, whether it’s emails or notes. You can use it to rewrite your sentences, fix grammar errors, or adjust the tone. The feature, called Writing Tools, is available across apps. In Notes, for instance, there’s a button to get help quickly. Elsewhere, just highlight some text and select Writing Tools from the pop-up menu.
Siri is getting a fresh look! The screen now lights up around the edges, making it way more eye-catching. Siri can help with questions about Apple products, troubleshoot issues, and even has a more natural-sounding voice. It can also finally keep up when you hesitate or rephrase something mid-request. And guess what? You can now type your requests to Siri, not just speak them!
Apple says Siri will be better at understanding context, so you can ask something and follow up without any issues. I’ve tried out these new skills, and a few weeks back, when I asked it to set an alarm for 9:50 a.m. and then switched to 8:50 a.m., it got a bit mixed up and asked which one to change. But today? It switched the alarm without a hitch! It feels like Siri’s made some big strides in understanding what you need.
Just like the image tools from Google and Samsung, Apple’s stepping up its game! Now you can remove unwanted objects from your photos. You can even whip up memory movies just by typing a description of what you want. Plus, if you’re searching for a specific photo in your gallery, you can describe what you’re looking for to find it easily!
Notification summaries aim to make your notifications more helpful. Instead of just showing the first line of a text or email, Apple Intelligence gives you a little summary of the message. I can see how that could be handy, but honestly, I mostly found it unnecessary and sometimes pretty funny!
As a journalist, I can say that transcribing audio is super tedious and a bit cringe-worthy. That’s why I’m really glad Apple added AI-powered voice transcription in iOS. Just think of all the ways it could help, like transcribing lectures or supporting people with hearing impairments. After you record an audio note in the Notes app, you can even see a summary of the recording.
Apple’s initial launch of generative AI is somewhat limited and seems crafted to steer clear of the issues that have affected other companies introducing similar tools.
Some of the most anticipated features won’t arrive until December, according to Bloomberg’s Mark Gurman. These include ChatGPT integration, image-editing tools, custom emoji creation, and automatic message sorting in the iPhone’s email app. An upgraded Siri and support for devices in the EU are expected to follow in April.
The first steps toward an AI future have arrived, and it’s undeniable that some of the features Apple Intelligence has unveiled, namely a smarter voice assistant and nearly seamless voice transcription, are real advancements.
In the end, how useful Apple Intelligence is for you depends on your personal preferences. Personally, I enjoy going through photos to find the right ones for my projects. If you’re curious about the growing influence of AI in consumer tech, check out Gadget Flow’s AI Gadgets catalog!