There are two types of ML being conflated here. Being able to use ML in your own app is very distinct from the use of one app's data in another app. The latter feature is Siri knowledge about the user, pulled from the user's data and interactions in first-party Apple apps, held in a private knowledge graph that is generated on-device and shared across devices with end-to-end encryption. It's Apple's way of combating the cloud-based knowledge graphs that expose private user data.
The Core ML feature, OTOH, just lets third-party developers easily integrate machine vision, pattern/speech recognition, and other basic ML capabilities into their own apps.
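To make the second point concrete, here's a minimal sketch of the kind of thing Core ML-era APIs enable, using Vision's built-in image classifier (no custom model needed). Assumes iOS 13+/macOS 10.15+ and a `CGImage` you already have; the function name and the top-5 cutoff are my own choices for illustration:

```swift
import Vision
import CoreGraphics

// Illustrative sketch: classify an image with Vision's built-in classifier.
// The hypothetical helper returns the top few (label, confidence) pairs.
func topClassifications(for image: CGImage) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    // Each observation carries an identifier (e.g. "cat") and a confidence score.
    return (request.results ?? [])
        .prefix(5)
        .map { ($0.identifier, $0.confidence) }
}
```

That's the whole integration surface for the app developer; none of it touches the user's cross-app Siri knowledge graph, which is exactly the distinction being made above.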