
How Apple's AI tags could help find apps more easily
What's the story
Apple is now using artificial intelligence (AI) tagging to improve app discoverability on the App Store.
The feature is available in the developer beta build of iOS 26, but isn't yet visible on the public App Store or influencing its search algorithm.
The move comes amid developer concerns about how such changes could affect an app's search ranking within the store.
Discoverability concerns
How app metadata influences search rankings
A recent analysis by app intelligence firm Appfigures indicated that metadata from an app's screenshots may be affecting its ranking.
The firm speculated that Apple was pulling text from screenshot captions, a process it believed relied on Optical Character Recognition (OCR).
However, Apple clarified at its Worldwide Developers Conference (WWDC) 25 that the data extraction process uses AI, not OCR.
Tagging strategy
Categorizing apps in a better way
At WWDC 25, Apple revealed its plan to use screenshots and other metadata for improving app discoverability.
The company said it employs AI techniques to extract information that would otherwise be hidden in an app's description, category information, screenshots, or other metadata.
This process allows Apple to assign tags that categorize an app more accurately, without requiring developers to add keywords or take other steps to influence the tags.
Developer control
Tags will go through human review process
Apple has promised developers that they will have control over the AI-assigned tags associated with their apps.
The company also assured developers that humans would review these tags before they go live on the public App Store.
As the feature rolls out to App Store users globally, understanding which tags can enhance an app's discoverability will become crucial for developers.