iOS 16 sends the iPhone 14 back to 2012 with old Android features and forward to 2032 with amazing AI tricks

Written by admin

It’s a bizarre “old meets new” paradox that only Apple can make possible, thanks to its stubborn approach to iPhone customization over the years and Cupertino’s growing power to make super-advanced software, hardware and AI work together seamlessly.

iOS 16 – Live Wallpapers (Xiaomi)

Live wallpapers are another feature clearly inspired by Android, as the new animated Earth wallpapers draw inspiration from China’s Xiaomi and MIUI 12, which first introduced Super Wallpapers: a two-year-old party trick that Apple clearly liked enough to bring to the iPhone.

iOS 16 – Photo Shuffle and Themes with a choice of colors (Huawei and Google)

The new color customization and Photo Shuffle wallpaper options, where the iPhone can show you a brand new wallpaper every time you unlock it, have also been around for at least five years on Android, thanks to Huawei’s Magazine Unlock. And the new iOS color picker is clearly inspired by Google’s Material You design that debuted with Android 12 and the Pixel 6.

iOS 16 – Keyboard Haptics (Android)

The new keyboard vibration, or “Keyboard Haptics,” as Apple calls it, will probably be my personal favorite iOS 16 update, because it lets the iPhone give you vibration feedback as you type.

Until now, this was a huge missed opportunity for Apple to take advantage of the iPhone’s excellent Taptic Engine, but the feature is finally here. Needless to say, Android phones have supported keyboard vibration for over a decade.

But iOS 16 also brings great new features that Samsung and Google will be jealous of

Sure, we could label iOS 16 as the “Android update,” but that’s only fair to a degree: customization. That’s about it…

However, Tim Cook & Co also made sure we got a handful of surprises, and you better believe a few absolutely blew me away. Apple’s super-powerful A-series chips and extreme software expertise meet AI and Machine Learning. That’s how I would describe it. Let’s see!

iOS 16 new Sharing features – sharing means caring (and working together)

That’s a big one! I would describe the new sharing features in iOS 16 in one word: teleportation. But there’s more to it…

For starters, the new iCloud Shared Photo Library lets you share photos with friends and family, but in a smarter way. It’s not just a simple shared album: smart setup rules let you choose which photos to share based on the people/faces in them or the date.

iPhones with iOS 16 can also send photos you’ve just taken to a chosen friend right when you take the photo. Teleportation – am I right? You can also edit pictures together – everyone has the same rights to add, edit, favorite, caption and remove pictures from the shared album.

You can also collaborate via Shared Notes (available since iOS 15) and Safari Tabs where anyone can add their own tabs and instantly see the tab group update as you collaborate. So iOS 16 is big on sharing, communication and collaboration, if it wasn’t already obvious! But let’s move on…

Live text and live captions in video

For starters, Live Captions aren’t entirely new on phones, as Google has already introduced an equivalent feature with Pixel 6, thanks to Android 12 and Tensor.

But what we hadn’t seen before is something called Live Text in Video. It allows you to highlight, copy and paste and even perform tasks such as searching the web – all from the text in your videos. Live Text for Photos was already available on iOS 15 and Android 12, but this is a powerful move from Apple, and I’m here for it.

Remark: Apple, I’d like a button that lets me toggle the feature on or off, as I found myself highlighting text in some of my photos while trying to zoom in and out. Please.

Lock Screen To Do Lists

This one is as simple as it sounds, and I have to admit I confused it with Reminders on your lock screen at first, which may not be that far from the truth.

Anyway… iOS 16 and the iPhone let you keep to-do lists on your lock screen, which I think is a brilliant touch. I like to set reminders and to-do lists, but I hardly ever open the respective apps when I need to track my tasks, so this could be a game-changer.

Drag-and-drop Visual Look Up (create PNG stickers in a split second)

And here’s where Apple’s iOS 16 simply flexes on the competition.

Have you ever wanted or needed to make a PNG cutout of yourself or an object? Or even turn something into a sticker (some of my friends use WhatsApp sticker extensions to make this possible)? Well, with Visual Look Up’s new drag-and-drop feature, you can now select a subject from a photo with a long press; your iPhone immediately identifies the subject and makes a cutout, and you’re ready to drop it wherever you want. Magic. No need for Photoshop or sticker generators. The kids will go crazy for this…

The Desk View Continuity Camera could be the best Apple magic trick I’ve ever seen

And if Visual Look Up is magic, the new Continuity Camera and Desk View features in iOS 16 seem like pure sorcery.

For starters, iOS 16 allows you to use your iPhone as your MacBook video camera. No wires, no fiddling. While this may seem easy to do, the reality is that it isn’t. But Apple definitely made it appear easy…

Suppose you wanted to pull off a similar trick without Continuity Camera… In that case, you would either need to connect your phone or dedicated camera to your laptop with a cable, or buy a special receiver that transmits a signal from your DSLR to your computer. On top of that, not all video conferencing apps handle an external camera well, so things can very quickly get very messy.

And now it seems like Tim Cook snapped his fingers and made it possible without any external hardware or effort. Nicely done, Apple.

Desk View is the futuristic feature we never knew we needed

Anyway, you may or may not need to use your iPhone as a MacBook webcam for better quality. But the amazing new iOS 16 feature that you will definitely want to try out is called Desk View.

Desk View uses your iPhone’s ultra-wide-angle camera during video calls (on Mac) to show you two perfectly independent video streams of:

  • Yourself – centered on the screen
  • An overhead view of your desk/work area

Let that sink in…

While it would make a little more sense to pull this off with both the primary and ultra-wide-angle cameras, it seems that the iPhone relies only on its ultra-wide-angle lens on the back to make the magic happen. That only makes it seem more like magic.

The practical implication of Desk View in the context of a MacBook call is that you can show what you’re working on while remaining in the picture, and collaborate more efficiently with your colleagues and friends.

But if Desk View is powered only by the iPhone and iOS 16, I don’t see why it shouldn’t come to the iPhone 14 as a standalone feature. Then you could use it in all kinds of wild scenarios like unboxing videos, cooking videos, tutorials, etc. It could be a really powerful tool, obviously made possible by some mighty AI algorithms. Nicely done, Apple.
