I'm talking about the charging brick (the USB-C brick), not the USB-C port on the Mac itself.
Oh yeah, that's no problem at all. The brick does a data handshake with the phone, determines the voltage the phone requires, and supplies that. The current/power is then limited by the phone's resistance across its charging pins, so the phone draws the current rather than the charger pushing it. The charger is capable of supplying current/power up to its maximum rating, but how much it actually supplies is determined by the phone.
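If code makes that easier to picture, here's a tiny Python sketch of the idea behind the handshake. It's not the real USB-PD protocol (which is far more involved), and the profile numbers are just made up for illustration:

```python
# Rough sketch of the handshake idea: the brick advertises what it can do,
# the phone asks for a voltage and a current it intends to draw, and the
# brick grants it if it can. (Real USB-PD is far more involved; these
# profiles are only illustrative.)

charger_profiles = [(5.0, 3.0), (9.0, 2.22), (15.0, 3.0), (20.0, 5.0)]  # (volts, max amps)

def negotiate(profiles, requested_voltage, requested_current):
    """Grant the requested voltage/current if some profile can cover it."""
    for voltage, max_current in profiles:
        if voltage == requested_voltage and max_current >= requested_current:
            return voltage, requested_current  # the phone only gets what it asked for
    return 5.0, 0.5  # otherwise fall back to basic USB power

# A phone that wants 9 V and will draw up to 2 A:
print(negotiate(charger_profiles, 9.0, 2.0))  # -> (9.0, 2.0)
```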
There are two basic formulas to work this stuff out:
V = IR
P = VI
Where:
V = voltage (V)
I = current (A)
R = resistance (Ω). NB: Ω is spelt out, and pronounced, as ohms.
P = power (W)
Chargers supply a fixed voltage. Devices being charged present a fixed resistance.
Chargers also have a rated maximum power and/or current, e.g. a 20W charger. However, that is merely the maximum it can supply. How much it actually supplies depends on the device being charged, or more specifically, on its resistance.
The actual current/power supplied can be determined by the above equations.
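If it helps, here are those two formulas as a couple of trivial Python helpers (just a sketch; the example numbers are only illustrative):

```python
# V = IR and P = VI, rearranged to work out what a device actually draws.

def current_from_power(power_w, voltage_v):
    """I = P / V"""
    return power_w / voltage_v

def resistance_from_voltage(voltage_v, current_a):
    """R = V / I"""
    return voltage_v / current_a

# Example: a device drawing 20 W from a 9 V supply
current = current_from_power(20, 9)               # ~2.22 A
resistance = resistance_from_voltage(9, current)  # ~4.05 ohms
print(f"{current:.2f} A across {resistance:.2f} ohms")
```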
To give you an explanation that might make more sense of it, take the power supply to your home and light bulbs of various power ratings.
Here in Australia, our standard home power circuits are 240V (voltage) rated up to 10A (current). So the power lines coming into the house supply a pretty much constant 240V. They can actually carry a truck ton of current, but that is limited by two things: the circuit breaker / fuse to each power circuit in the home, which is for safety; and the resistance across the devices that draw the power.
Say you have a lighting circuit, and all the lights are switched off. In this case the resistance across the circuit is infinite, and thus the current supplied is zero.
Now say you switch on a 60W light bulb. That light bulb is really just a resistor that glows when current flows through it.
So how much current flows through it, and how much resistance does the bulb have? Well, we can calculate both from the equations above.
We know that V = 240V, and P = 60W.
P = VI, thus
I = P/V = 60/240 = 0.25 A
V = IR, thus
R = V/I = 240/0.25 = 960 Ω
So what we actually have is a fixed voltage of 240V, applied across a 960 Ω resistor, which results in 0.25A of current flowing through the circuit, using 60W of power.
What about a 30W light bulb?
I = P/V = 30/240 = 0.125 A
R = V/I = 240/0.125 = 1920 Ω
So comparing the two light bulbs:
A: 960 Ω -> 0.25A -> 60W
B: 1920 Ω -> 0.125A -> 30W
So the LOWER the resistance, the HIGHER the current/power drawn.
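Or, if you prefer to see it run, here's the same two-bulb comparison as a quick Python check:

```python
# The two bulbs from above, worked out with I = P/V and R = V/I at 240 V mains.

MAINS_VOLTAGE = 240.0

for power_w in (60, 30):
    current_a = power_w / MAINS_VOLTAGE          # I = P / V
    resistance_ohm = MAINS_VOLTAGE / current_a   # R = V / I
    print(f"{power_w} W bulb: {resistance_ohm:.0f} ohms -> {current_a:.3f} A")

# Output:
# 60 W bulb: 960 ohms -> 0.250 A
# 30 W bulb: 1920 ohms -> 0.125 A
```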
So even though the circuit is rated to supply 10A / 2400W, each light bulb only draws what it is designed for, simply because of the resistance across its terminals.
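Just to put numbers on that, here's a little sketch of a lighting circuit against its 10A breaker (the wattages are made up; the point is how far below the limit a handful of bulbs sits):

```python
# A few loads on one 240 V circuit, checked against a 10 A breaker.
# The wattages are just illustrative.

MAINS_VOLTAGE = 240.0
BREAKER_LIMIT_A = 10.0

loads_w = [60, 30, 60, 100]                                # bulbs on the circuit
total_current_a = sum(p / MAINS_VOLTAGE for p in loads_w)  # I = P / V for each load

status = "OK" if total_current_a <= BREAKER_LIMIT_A else "breaker trips"
print(f"Total draw: {total_current_a:.2f} A of {BREAKER_LIMIT_A:.0f} A ({status})")
# Total draw: 1.04 A of 10 A (OK)
```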
Same thing applies to all Apple chargers, phones, laptops, etc. The chargers have a max power/current rating, but they only supply what the device being charged is designed to draw.
You might want to be careful with chargers from unknown brands, but there are plenty of well-known brands that are also good to go, e.g. Anker, Belkin, etc.
Note also that you can use a lower rated charger on a higher rated device, and all that happens is that it charges more slowly. E.g. you can use a 20W charger with a 16" MBP, which is rated to handle 140W of charging; it will simply charge at 20W instead of the 140W you'd get from the full-size charger.
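As a rough back-of-envelope comparison (ignoring charging losses and the taper near full charge, so real times are longer, and assuming a 100 Wh battery purely for illustration):

```python
# Rough charge-time comparison. Ignores losses and the charge taper,
# and the 100 Wh battery figure is only an assumption for illustration.

BATTERY_WH = 100.0

for charger_w in (20, 140):
    hours = BATTERY_WH / charger_w
    print(f"{charger_w} W charger: roughly {hours:.1f} h for a full charge")

# 20 W charger: roughly 5.0 h for a full charge
# 140 W charger: roughly 0.7 h for a full charge
```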
I sure hope you actually bothered to read all that ha ha, and that it made sense, and now you understand why you can use any Apple charger to charge any Apple phone or laptop and it won't be damaged.
FYI - even though I work as a software dev, I have a BEng (electrical/electronics) degree.