I think the magic of pointers has more to do with how they handle things like addition. I never quite got this. What's so "magic" about pointers? The name suggests "it's not the thing, it just points to the thing". Maybe it would have helped if they had been referred to as "addresses" rather than "pointers", because that's what they really are. I believe the "black magic" was introduced by the teaching approach that "the actual hardware is irrelevant", where code is viewed as some abstract mathematical recipe (a view I oppose).
I always explain it like this: computer memory can be compared to a (huge) number of boxes all neatly lined up, each of which can store an integer between 0 and 255. Each of those boxes has a unique number, which we'll call its address. The computer can store and retrieve the contents of each box (the number between 0 and 255). To the computer, they're just numbers, and it depends on context what those numbers mean. If a box contains the number 65, then it could mean just that - the number 65 - or it could mean the letter A, or a pixel with an approximately 25% grey value, or the CPU instruction "LD H, L", or even part of something bigger like the word "Apple" or some floating point number (which takes 4 or 8 of those boxes combined) or whatever.
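To make that concrete, here's a minimal C sketch (variable names are just for illustration) of the same box being read in two different contexts:

#include <stdio.h>

int main(void) {
    unsigned char box = 65;            /* one "box" holding the value 65 */

    printf("as a number: %d\n", box);  /* prints 65 */
    printf("as a letter: %c\n", box);  /* prints A */
    return 0;
}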
Since this "meaning" is important, the programming language tries to keep track of it: "These couple of boxes together are actually one thing, namely the word Apple, or this floating point value, or an image, or a database, or an audio file, or the memory address of some other box," etc.
The "drawback" of languages which expose these pointers to the programmer is that this can "break the abstraction". For some things it's fine if you are able to say "increase the value of the number in this box by 1" - in the context of a grey pixel, it just became a tiny bit lighter; if it was the letter A then it will become the letter B. But if it were part of the executable code, and you tried to "add one to the instruction LD H, L", and computers didn't hate being anthropomorphized so much, they would probably say "stop it, you're making me uncomfortable."
That with int* a;, a + 1 will actually add 4 to the address a holds (on a typical platform where int is 4 bytes), because the compiler automatically scales the arithmetic by sizeof(int). And with long long* a;, a + 1 will add 8. If you've never properly been introduced to how the pointed-to type affects the arithmetic, it can seem quite unnatural.
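A minimal sketch of that scaling (the byte counts it prints are whatever sizeof(int) and sizeof(long long) are on your platform, typically 4 and 8):

#include <stdio.h>

int main(void) {
    int nums[2] = {10, 20};
    long long bigs[2] = {30, 40};

    int *p = nums;
    long long *q = bigs;

    /* p + 1 advances by sizeof(int) bytes, q + 1 by sizeof(long long) */
    printf("int step:       %td bytes\n", (char *)(p + 1) - (char *)p);
    printf("long long step: %td bytes\n", (char *)(q + 1) - (char *)q);

    /* the scaled arithmetic is why *(p + 1) and p[1] are the same thing */
    printf("*(p + 1) = %d\n", *(p + 1)); /* prints 20 */
    return 0;
}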
Also, I get people's confusion when they see, for example, reinterpret casts:
float b = *(float*)(void*)&a;
And the syntax for function pointers has always just been insane.
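For anyone who hasn't hit it yet, even the simple cases read inside-out, and the standard library's signal() is the classic worst case:

/* op is a pointer to a function taking (int, int) and returning int */
int (*op)(int, int);

/* the classic: signal() takes an int and a pointer to a handler
   (a function taking int, returning void), and returns a pointer
   to a function of that same handler type */
void (*signal(int sig, void (*handler)(int)))(int);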
I've also seen some confusion relating to pass-by-value and pointers. Too many levels of indirection can get confusing.
So while people can easily get that
int a = 5;
callFunc(a);
gives a copy of a to callFunc and doesn't risk altering a, it's easier to get confused when you see
int changePtr(int** ptr);
taking a double pointer. It eventually clicks for people that it's due to the pass-by-value thing, but it's like having 20 nested if blocks: at some point it just gets confusing.
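For what it's worth, here's a sketch of why you end up with that signature (changePtr and its body are hypothetical, just following the declaration above): the function has to receive the address of the caller's pointer, because the pointer itself is passed by value like anything else.

#include <stdio.h>
#include <stdlib.h>

/* To rewrite the caller's pointer, we need its address: the int*
   itself gets copied on the way in, just like the int in callFunc. */
int changePtr(int **ptr) {
    int *fresh = malloc(sizeof *fresh);
    if (fresh == NULL) return -1;
    *fresh = 42;
    free(*ptr);     /* assumes *ptr is NULL or was malloc'd */
    *ptr = fresh;   /* this assignment is what the caller sees */
    return 0;
}

int main(void) {
    int *p = NULL;
    if (changePtr(&p) == 0)  /* pass the address of p so it can be rewritten */
        printf("%d\n", *p);  /* prints 42 */
    free(p);
    return 0;
}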