Yeah, it's interesting.
I mean, as far as displaying it goes, they could do what they currently do with Exposé and treat each screen separately, but they seem to be treating gestural controls as an input separate from the mouse cursor.
So although they could treat each screen separately and simply activate the gestures on whichever screen the cursor is currently on, I don't think they want to do this. They seem to be tying these gestures into the core OS rather than having them as an attachment to the pointer.
I'll be interested to see how they get around this issue. And also, I really, REALLY hope they introduce a dual-screen menu bar system!!