This is one of the things that comes up with VR as well - whether people see it as an authoring environment / workspace, or as a playback environment. Screen-based authoring environments for AR more or less already exist with Unity & Unreal, and could easily be worked up as an adaptation of the sorts of workflows used for QuickTime VR style virtual tours. It's unlikely Apple has anything to bring to that space, as it's always going to be inherently cross-platform / lowest common denominator.
For examples of an AR-based toolset, where you author in the same environment you're working in, go look at Leap Motion's YouTube channel. They've hit the nail on the head: the first things you need are precise hand tracking and depth-mapped object occlusion (they're way ahead of Magic Leap on that front).
I don't think the "make it on a screen, deploy it on a headset" paradigm has any real future - once you start doing work in a full-body 6-DOF environment, sitting at a desk simply isn't a fulfilling way to spend time. The people who are into the deployment environment enough to develop content for it are also going to want their tools to live in it.