Apple has your credit history and personal history from Apple Pay, and runs a credit check when you sign up for the monthly payment plan.
From my reading of the architecture, Apple Pay only involves Apple's servers when you pay within an app, versus using NFC. In that case, Apple is acting as a merchant network, much like the networks that serve local stores. Authorization of NFC payments is done entirely via EMV, which involves the bank, the card network, and the merchant network, but doesn't loop Apple into the process at that point.
Apple tracks where you are physically (look up consolidated.db on iOS) and where you drive and park (a new Apple Maps feature).
You do know that DB is a local cache, so your phone doesn't need to ping external servers nearly as often for things like Wi-Fi-based location pinpointing, yes? Apple stupidly left it plaintext in backups, which does leak private information, but the DB itself is kept local precisely to make tracking harder: fewer lookups means fewer cache updates, so getting at the data requires an exploit or physical access to the DB itself. And even with access to the DB, until more recent additions (popular places, etc.) you didn't get anything more interesting than the general areas the phone had been in. In many cases I can get more out of your address book, such as the specific location of your home and work, which popular places will only roughly hit a lot of the time.
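To make the "general areas, not a trail" point concrete: the real cache is a SQLite file, and researchers reported tables like CellLocation with Timestamp/Latitude/Longitude columns. Treat that schema as an assumption here; this sketch runs against a synthetic in-memory copy and just buckets cached fixes into coarse grid cells.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# The cache's timestamps are seconds since Apple's absolute-time
# epoch, 2001-01-01 UTC.
APPLE_EPOCH = datetime(2001, 1, 1, tzinfo=timezone.utc)

def coarse_areas(con, precision=1):
    """Group cached fixes into ~10 km grid cells (one decimal place
    of lat/lon). The result shows *areas* the phone has been in,
    not a precise movement trail."""
    areas = {}
    for ts, lat, lon in con.execute(
            "SELECT Timestamp, Latitude, Longitude FROM CellLocation"):
        key = (round(lat, precision), round(lon, precision))
        areas.setdefault(key, []).append(APPLE_EPOCH + timedelta(seconds=ts))
    return areas

# Demo with a synthetic cache (the real consolidated.db lives in the
# device backup and has many more columns than shown here).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE CellLocation "
            "(Timestamp REAL, Latitude REAL, Longitude REAL)")
con.executemany("INSERT INTO CellLocation VALUES (?, ?, ?)", [
    (300000000.0, 37.7749, -122.4194),   # downtown SF
    (300000600.0, 37.7751, -122.4190),   # same grid cell, 10 min later
    (300090000.0, 37.3318, -122.0312),   # Cupertino
])
areas = coarse_areas(con)
print(sorted(areas.keys()))  # two coarse cells, not three exact points
```

Three cached fixes collapse into two rough neighborhoods, which is about the granularity the pre-"popular places" cache gave an attacker.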
Now, if people are noticing phone-home packets delivering this information to Apple, that's another matter. But the DB itself is kept local specifically to make such tracking harder, not easier.
Yup, this part is called "telemetry" or "usage data and diagnostics," depending on what the company calls it. It's not a new practice, and you can turn it off (if you're paranoid, I assume, and hope, you already have). In general, the goal is to collect aggregate data about how devices are used in a general sense. Any specific user's data is actually not all that interesting, but knowing that 80% of your users interact with a feature you thought would be niche is useful for targeting future work, even more so if you find out which errors people are actually hitting most (not just crashes). It only requires a representative sample of the larger community to work, so there's no reason to leave this stuff on if you don't like the idea of carrying around the software equivalent of a Nielsen box. Because when you flip that switch on, that is what you are doing: turning your phone into a Nielsen box.
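A toy sketch of why a sample is enough, with every event name, weight, and opt-in rate invented for illustration: the vendor only needs per-feature totals from the opted-in devices, never any one device's full log, to learn that the "niche" feature isn't niche.

```python
import random
from collections import Counter

random.seed(0)

# Hypothetical per-device event log; names and weights are made up.
def device_events():
    return random.choices(
        ["open_app", "use_niche_feature", "hit_sync_error"],
        weights=[10, 8, 1], k=20)

fleet = [device_events() for _ in range(1000)]
opted_in = random.sample(fleet, k=100)   # the "Nielsen box" households

# Aggregate: did each opted-in device touch the feature at all?
totals = Counter()
for events in opted_in:
    totals.update(set(events))

share = totals["use_niche_feature"] / len(opted_in)
print(f"devices touching the 'niche' feature: {share:.0%}")
```

The `set(events)` step is the interesting design choice: it throws away how often and in what order, keeping only "used it at all," which is the aggregate question the vendor actually cares about.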
Also, the news stories about differential privacy all say it is theoretical and that no one has deployed it before; statisticians and scientists say it is possible, but it depends on how the software is built. Do you trust Apple to make bug-free software?
Well, the math behind the concept is fairly sound, and the concept itself is fairly interesting. It is still a step in the right direction beyond "hey, let's harvest your information for these suggestions, no matter how identifiable it might wind up being," where your only option is a yes/no on whether identifiable details get sent back at all.
But yes, bugs here could be bad. I would hope they black-box test the module, or architect it so that, if we are doing predictive word guesses, the input is always something like "one word" and the output is always "a dozen words, one being the input, the other eleven being random words." It's harder to introduce a privacy-breaking bug when the inputs and outputs are that strict. Something like this is also "tunable," in the sense that you can decide how far trends must stick out above the noise: tune it toward privacy and you miss more of the subtle aggregate trends; tune it toward surfacing as many aggregate trends as possible and you lose privacy.
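That strict in/out contract is essentially randomized response, the classic building block behind local differential privacy. A minimal sketch, with the vocabulary, truth probability, and device counts all invented: each device reports its true word only some of the time, and the server recovers aggregate frequencies by subtracting the expected noise. The "tuning knob" is `p_truth`: lower it and individual reports say almost nothing (more privacy), but the error bars on the aggregate estimate widen (fewer subtle trends survive).

```python
import random
from collections import Counter

random.seed(1)

VOCAB = ["hello", "thanks", "lol", "meeting", "dinner"]

def noisy_report(true_word, p_truth):
    """Each device reports its real word with probability p_truth,
    otherwise a uniformly random vocabulary word."""
    if random.random() < p_truth:
        return true_word
    return random.choice(VOCAB)

def estimate_counts(reports, p_truth):
    """Invert the noise. For each word w:
       E[observed_w] = p_truth * true_w + (1 - p_truth) * n / |V|."""
    n = len(reports)
    observed = Counter(reports)
    return {w: (observed[w] - (1 - p_truth) * n / len(VOCAB)) / p_truth
            for w in VOCAB}

# 10,000 devices; 6,000 actually typed "hello", 4,000 typed "dinner".
truth = ["hello"] * 6000 + ["dinner"] * 4000
p = 0.25   # privacy-leaning: 3 out of 4 reports are pure noise
reports = [noisy_report(w, p) for w in truth]
est = estimate_counts(reports, p)
print({w: round(c) for w, c in est.items()})
```

Note that no single report is trustworthy (it's probably noise), yet the aggregate estimate lands near the true 6,000/4,000 split, which is exactly the trade the text describes.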
Either way, I do hope Apple offers a switch (or ties it to the usage/diagnostics switch) for features that rely on differential privacy, for those who aren't willing to jump into this particular world head-first.