Again, it entirely depends. If a professor requires FCP7, it would be the OP's own fault for not using the assigned program. It would be the same deal if the prof required Premiere and the OP used FCP7 instead.
This is actually my point (the OP hasn't yet stated whether the platform and applications are required, let alone which ones, though I expect they are).

I'm hoping the OP will state the requirements, or take a closer look to find out, before a mistake is made and the consequences of a bad decision have to be faced.

FCPX != FCP7
I'm not talking about FCP X vs. FCP 7. Step back a moment, and think more broadly, as the platform used by the university may not even be OS X.

I'm actually concerned that the OP is trying to implement an OS X system when the school will be teaching under Windows. Equivalent applications on a different platform are quite possible: say the school is using Avid under Windows while the OP tries to use FCP 7 or FCP X, or even CS5 for OS X for that matter. The point is that the software the OP intends to use may not be the same suite the university has decided to teach with.

I don't see why they would be different. That's a processor issue, not an OS issue, and we're all on the same processors now. Compilers can be slightly different, but with vector processing and GPU acceleration, most algorithms run on hardware that isn't compiler-dependent.
No. Math is math.

If one application uses a different algorithm than another, the output will not be identical (e.g., one application computes A+B = C while another computes A+C = B+4; neither was after true values for the function in the first place, since the point is visual appearance, not technical accuracy). Simplistic, but it demonstrates that the algorithms used may not be identical where numerical accuracy isn't critical (the goal might be to create a smoke effect, not to balance every credit card transaction passing through the Visa network).
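To make that concrete, here's a rough Python sketch (toy values, invented kernels; not taken from any real editing app): two filters that both "soften" the same row of pixels, one a plain box blur and one centre-weighted. Both achieve the same visual goal, but the numbers they produce are not identical.

```python
# Hypothetical example: two different "softening" algorithms, same visual goal,
# different numeric output. Pixel values and kernels are made up.

row = [10, 200, 30, 40, 250, 60]

def box_blur(px):
    # plain 3-tap average, edges clamped
    out = []
    for i in range(len(px)):
        left = px[max(i - 1, 0)]
        right = px[min(i + 1, len(px) - 1)]
        out.append(round((left + px[i] + right) / 3))
    return out

def weighted_blur(px):
    # centre-weighted (1, 2, 1) kernel -- a different algorithm, same "blur"
    out = []
    for i in range(len(px)):
        left = px[max(i - 1, 0)]
        right = px[min(i + 1, len(px) - 1)]
        out.append(round((left + 2 * px[i] + right) / 4))
    return out

print(box_blur(row))       # [73, 80, 90, 107, 117, 123]
print(weighted_blur(row))  # [58, 110, 75, 90, 150, 108] -- not identical
```

Neither result is "wrong"; they're just different algorithms chasing the same look.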

Where this can be important, however, is when the professor has written a comparative algorithm as a means of grading the students' work. In such a case, it will show errors that would reduce the grade (I mention this because it's sometimes used as a grading method in various Science, Engineering, CS, and similar disciplines).
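For illustration, a comparative check of that sort could be as simple as the following sketch (the function name, tolerance and sample values here are all invented):

```python
# Hypothetical grading-style comparison: count how many values in a student's
# rendered frame fall outside a tolerance of the reference frame.

def compare_frames(student, reference, tolerance=2):
    """Return the fraction of samples differing from the reference by more than `tolerance`."""
    if len(student) != len(reference):
        raise ValueError("frames are different sizes")
    mismatches = sum(1 for s, r in zip(student, reference) if abs(s - r) > tolerance)
    return mismatches / len(reference)

reference = [128, 130, 129, 131, 127]
student   = [128, 131, 126, 131, 127]   # one sample off by 3

print(f"{compare_frames(student, reference):.0%} of samples out of tolerance")  # 20%
```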

As for UI differences, I don't use it, but looking at screenshots, it does look different to me (I don't expect it to be as difficult as learning a totally unfamiliar spoken language, but it could still mean a learning curve for a student who chose such a path).
 
I'm actually concerned that the OP is trying to implement an OS X system when the school will be teaching under Windows. Equivalent applications on a different platform are quite possible: say the school is using Avid under Windows while the OP tries to use FCP 7 or FCP X, or even CS5 for OS X for that matter. The point is that the software the OP intends to use may not be the same suite the university has decided to teach with.

Again, I'm not sure this is cause for concern.

1) Macs are widely used for video editing. The software is identical. I know it is. I've used it.
2) You're already assuming the school teaches Windows, and further making the leap that it would be Mac unfriendly.
3) You're talking about output when identical output can't even be guaranteed from one display to another, making it a complete non-issue.

Really. Macs are big enough in video editing that this is entirely a non-issue.

No. Math is math.

If one application uses a different algorithm than another, the output will not be identical (e.g., one application computes A+B = C while another computes A+C = B+4; neither was after true values for the function in the first place, since the point is visual appearance, not technical accuracy). Simplistic, but it demonstrates that the algorithms used may not be identical where numerical accuracy isn't critical (the goal might be to create a smoke effect, not to balance every credit card transaction passing through the Visa network).

a) Why would they change the algorithm between platforms? It's C code. It works on both platforms. This is not an issue. I've worked on cross-platform video software code before...
b) Even if the algorithm is the same, the output can be different depending on how different compilers handle rounding. I've seen this in the industry; it's also considered a bug in the software. Testing is generally done to make sure the output is identical.
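A minimal illustration of that rounding behaviour, assuming ordinary IEEE 754 doubles: the same additions done in a different order (as a different compiler, or an auto-vectorized build, might arrange them) can give a different answer.

```python
# Same numbers, same "math", different association order, different result.
values = [1e16, 1.0, -1e16, 1.0]

left_to_right = 0.0
for v in values:
    left_to_right += v          # 1e16 + 1.0 rounds back to 1e16, so a 1.0 is lost

reordered = (values[0] + values[2]) + (values[1] + values[3])   # pair the big terms first

print(left_to_right)  # 1.0
print(reordered)      # 2.0
```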

Where this can be important, however, is when the professor has written a comparative algorithm as a means of grading the students' work. In such a case, it will show errors that would reduce the grade (I mention this because it's sometimes used as a grading method in various Science, Engineering, CS, and similar disciplines).

Again, you CAN'T do this in video editing. I've worked on video editing software; it can't be done. Displays don't display colors the same at all. Even individual video players won't display the exact same colors.

You'd have to actually unpack the frames and compare them, and I have never, ever heard of that being done in film editing programs, because no one really cares. It either looks right or it doesn't. It's not a super exact science. It's not like you shift one bit and suddenly it's all wrong.
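As a back-of-the-envelope check on the "one bit off" point: if a single 8-bit sample in a 1080p frame is off by one code value, the error is far below anything visible (PSNR figures above roughly 50 dB are generally treated as visually lossless). A quick sketch:

```python
import math

# One RGB sample off by 1 (out of 0-255) in a single 1920x1080 frame.
width, height = 1920, 1080
total_samples = width * height * 3
mean_squared_error = (1 ** 2) / total_samples

psnr = 10 * math.log10(255 ** 2 / mean_squared_error)
print(f"PSNR ~ {psnr:.0f} dB")   # ~116 dB -- nowhere near a visible difference
```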

As for UI differences, I don't use it, but looking at screenshots, it does look different to me (I don't expect it to be as difficult as learning a totally unfamiliar spoken language, but it could still mean a learning curve for a student who chose such a path).

They aren't different. Really. I don't know where you're getting this from. They're exactly identical. I've actually spent time using them across platforms.

Even if any of this were an issue (which it isn't, again, the video industry is very Mac friendly), the OP could still load Windows on his Mac Pro and it would be a very capable rig.
 
Again, I'm not sure this is cause for concern.

1) Macs are widely used for video editing. The software is identical. I know it is. I've used it.
2) You're already assuming the school teaches Windows, and further making the leap that it would be Mac unfriendly.
3) You're talking about output when identical output can't even be guaranteed from one display to another, making it a complete non-issue.

Really. Macs are big enough in video editing that this is entirely a non-issue.
We don't know what the situation actually is, so it was worth mentioning IMO (maybe the professor in charge of the department is a stickler; maybe he hates OS X, ...), regardless of how Mac-friendly the creative market is.

a) Why would they change the algorithm between platforms? It's C code. It works on both platforms. This is not an issue. I've worked on cross-platform video software code before...
I was thinking in terms of different products from different developers (say CS5 vs. Avid for example), not the same product ported to multiple platforms.
 
We don't know what the situation actually is, so it was worth mentioning IMO (maybe the professor in charge of the department is a stickler; maybe he hates OS X, ...), regardless of how Mac-friendly the creative market is.

And that's possible, but so is the exact opposite. And I find it hard to believe that you could be in video editing classes where the entire class is on Windows. And again, worst case, the OP can dual boot to deal with the theoretical single professor who somehow has an entire class on Windows.

I was thinking in terms of different products from different developers (say CS5 vs. Avid for example), not the same product ported to multiple platforms.

Sure, but again, it doesn't matter. Color correction isn't a series of exact steps. Either the output looks good or it doesn't.

If a professor is teaching to an exact specific tool, then yeah, you should use that tool. Otherwise, so what? Color correction is about training your eye, not following a bunch of exact steps to get identical output, or learning how to move a slider into an exact position. People use all sorts of tools in industry.

Regardless, there are general formats for interchanging color correction data to produce the same output in different tools. But I think this entire discussion is getting ahead of itself. OP wants a Mac Pro, and there isn't any reason at all why it wouldn't be a great machine.
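One example of such an interchange format is the ASC CDL, which describes a primary correction as per-channel slope, offset and power. A minimal sketch of that transfer function (the numbers below are made up, and real implementations layer saturation and clamping details on top of this):

```python
# ASC CDL-style primary correction: out = (in * slope + offset) ** power,
# with negatives clamped before the power step. Values are normalized 0.0-1.0.

def apply_cdl(value, slope, offset, power):
    v = value * slope + offset
    v = max(v, 0.0)              # avoid raising a negative to a fractional power
    return v ** power

# Hypothetical grade: a touch more gain, a slight lift, a slight gamma adjustment.
print(apply_cdl(0.18, slope=1.1, offset=0.02, power=0.9))
```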
 
As a former film student, I can definitely say that the professors didn't make it a prerequisite for students to own any of the software and/or computers that were taught, because the school provided all the necessary equipment in its computer labs and editing rooms. So it was just up to the students to find the time to go to the labs and/or sign up for an editing room to complete any assignment. It helped that I lived near campus, so I got by without ever owning the programs we were taught (Final Cut Pro, Avid and Premiere Pro).

Just saying this because the school, depending on the program, should provide students with the necessary equipment to complete their work. It helps to have your own equipment, of course, but when I was going to film school, Avid was super expensive, and it wasn't until later, during senior year, when Final Cut Pro 1 came out (shows how old I am), that buying your own editing computer became affordable; some of the students who could afford it did so to complete their senior thesis project. I used the school's computer to complete mine.

Also, there are so many more important things to worry about when you're in college than what computer to get, and as a film student, I feel like owning a computer is really not the #1 priority when the school (again, depending) should have the facilities.

Just my 2 cents.
 
As a former film student, I can definitely say that the professors didn't make it a prerequisite for students to own any of the software and/or computers that were taught, because the school provided all the necessary equipment in its computer labs and editing rooms. So it was just up to the students to find the time to go to the labs and/or sign up for an editing room to complete any assignment. It helped that I lived near campus, so I got by without ever owning the programs we were taught (Final Cut Pro, Avid and Premiere Pro).

And this. If a program does mandate something, generally there are labs available.

As a CS student, it was always nice having my own rig at home, but if something required an exact tool, it was always provided by campus.
 
A 1,1 octo-core will do everything you need and more, including serious gaming. If you can pick one up cheap ($500 or less), it's worth it, since it won't eat too much into your budget for a future system. The other option is to wait for Sandy Bridge and spend for top of the line. It will last you a good 5 years, and Mac Pros ARE expandable, so it will be worth it in the long run.
 
I just got the '09 single-cpu model (2.66 GHz/W3520/3GB/640GB/GT120-512MB) with all original accessories for $1650. I think it's a good deal.

In quite a few ways, this model is the sweet spot in the used Mac Pro lineup right now.

It doesn't require expensive RAM modules like the '06 and '08 machines, and it sports Intel's most current (X58) technology. Therefore it can easily be upgraded with the latest CPUs, and adding RAM and storage doesn't have to cost a lot.

For about $300 I can drop in a W3565, which makes it nearly identical (except the GPU) to the current $2899 model for under 2 large. I use the machine primarily for audio and the GT120 suffices.

I ordered 12GB of RAM and 3 extra HDDs for a little under $300. With all of that, I got myself a neat little machine that should serve me well for a few years!

Finally, with the firmware hack, hexacore goodness can be had, although I suspect it will affect resale value.
 