
gunraidan

macrumors regular
Original poster
Jul 10, 2009
176
0
3D is not an underdeveloped and new technology. Alfred Hitchcock did it decades ago. Disney 3D seems to be getting it right this time around though, in my not-so-professional opinion. Going 3D is not expensive from the consumer's perspective. For a professional, think again. Laptops can't handle working on 3D movies, and most workstations don't handle it that well, in my experience.

I realize that, but you can just buy one of Nvidia's special graphics cards (which aren't too pricey) and get a TV-with-3D-glasses bundle, which will cost less than $1,000.

will soon become a problem because of cameras like the Red One. From what I've seen it's near impossible to do 1:1 editing with it unless you throw all your money into your machine. And then you need a 4K monitor to check focus, etc... $$$

I didn't intend to claim that going 4K was expensive, just that it may not become the standard.
 

LethalWolfe

macrumors G3
Jan 11, 2002
9,370
124
Los Angeles
I see, but why would most pros be worried about 4K or 3D if they have such little ground to stand on? Future-proofing, "just in case," I assume? Even so, call me ignorant, but from my observations going "3D" isn't too expensive, as all you need is a GTX card and a TV-with-glasses bundle. Then again, 3D is such an undeveloped, early technology that there is no telling what the future will hold if it truly gets its foot in the door.
Movies shot on film are often scanned into the computer at 4K for editing, and sometimes even higher for FX work, and cameras like the Red One shoot natively at 4K (and higher-res cameras are in the pipeline). Shooting a live-action 3D movie basically doubles the amount of footage you are working with because you are shooting w/two cameras side-by-side (one recording for each eye).
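For a rough sense of scale, here's a back-of-the-envelope sketch of the uncompressed data rates involved (the 8-bit RGB depth and 24 fps are just illustrative assumptions, not actual Red One recording specs):

#include <stdio.h>

int main(void) {
    /* Illustrative, uncompressed figures only. */
    double bytes_per_pixel = 3.0;   /* 8-bit RGB */
    double fps = 24.0;

    double hd = 1920.0 * 1080.0 * bytes_per_pixel * fps;  /* ~149 MB/s */
    double k4 = 4096.0 * 2160.0 * bytes_per_pixel * fps;  /* ~637 MB/s */

    printf("HD 1080p:          %.0f MB/s\n", hd / 1e6);
    printf("4K:                %.0f MB/s\n", k4 / 1e6);
    printf("4K stereo 3D (x2): %.0f MB/s\n", 2.0 * k4 / 1e6);
    return 0;
}

Even with heavy compression, you can see why stereo 4K material is in a different league from HD.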

So, in short, the tech arms race has become so cheap, fast, and developed that even if Apple decided to create their own hardware anyway, it wouldn't have that much of a leg up over x86 architecture?
Basically, yes. For example, it used to be commonplace for people to need hardware accelerator cards to work w/DV footage or to play back DVDs on their computers, but obviously those types of devices aren't needed anymore.


Lethal
 

EssentialParado

macrumors 65816
Feb 17, 2005
1,162
48
Is this why Apple was so eager to drop the "G" line after the G5: because the performance difference between the G5 architecture and x86 was thin and getting thinner with every new edition?
Basically, yes. But it also has a lot to do with Motorola refusing to create chips power-efficient enough for laptops.

"Grand Central Station"?
Oops, Grand Central Dispatch.

http://en.wikipedia.org/wiki/Grand_Central_Dispatch
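If you're curious what it actually looks like to a programmer, here's a minimal sketch of GCD usage from C (assuming Snow Leopard's libdispatch and a compiler with blocks support; the "tasks" are just placeholder print statements):

#include <dispatch/dispatch.h>
#include <stdio.h>

int main(void) {
    /* A system-managed concurrent queue; GCD decides how many
       threads to use based on the cores available. */
    dispatch_queue_t queue =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_group_t group = dispatch_group_create();

    for (int i = 0; i < 4; i++) {
        dispatch_group_async(group, queue, ^{
            printf("task %d running\n", i);  /* placeholder work */
        });
    }

    /* Wait until every submitted task has finished. */
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
    return 0;
}

The point is that the developer just submits blocks of work and the OS spreads them across however many cores the machine has.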


You have a lot of questions, don't you? :p
 

gunraidan

macrumors regular
Original poster
Jul 10, 2009
176
0
Movies shot on film are often scanned into the computer at 4K for editing, and sometimes even higher for FX work, and cameras like the Red One shoot natively at 4K (and higher-res cameras are in the pipeline). Shooting a live-action 3D movie basically doubles the amount of footage you are working with because you are shooting w/two cameras side-by-side (one recording for each eye).

I see. 3D doesn't sound that intensive, but if 4K becomes standard I can see that being a problem.


Basically, yes. For example, it used to be commonplace for people to need hardware accelerator cards to work w/DV footage or to play back DVDs on their computers, but obviously those types of devices aren't needed anymore.


Lethal

I agree. I mean, it's like someone said: the CPU won't be needed that much and it will all be done through video cards. This makes it sound like CPUs will go the way of RAM. RAM used to be highly important and went through many updates (from my knowledge), but as of now? RAM rarely needs to be upgraded. Hell, even in gaming there has yet to be a game that lists 4GB of DDR2 RAM of any kind in its recommended specs, and we've just seen the first game with a quad-core processor in its recommended specs, and it isn't even necessary according to the developer (I use gaming because it has much higher baseline specs than other programs). Really, the only thing that seems to be going up is video cards.

IMO, in the future Apple should just make a far cheaper Mac Pro line (not actual Mac Pros, but a separate tier) that allows the user to just upgrade the video card and RAM.

You have a lot of questions, don't you? :p

Yeah sorry I'm just curious. :)

If it's annoying you guys I promise that I'll stop if you tell me to.

EDIT - Sorry to ask another question (I looked this one up on Google, I swear!) but why wouldn't Motorola make power-efficient CPUs for laptops? Why would they cut their ties with such a strong force?
 

szark

macrumors 68030
May 14, 2002
2,886
0
Arid-Zone-A
EDIT - Sorry to ask another question (I looked this one up on Google, I swear!) but why wouldn't Motorola make power-efficient CPUs for laptops?

Motorola, who made the G4 processors, didn't have any incentive to create faster versions of the low-power chips they had. Their main CPU business consisted of selling lower-speed, low-power CPUs for use in large embedded devices such as high-end network routers and automated manufacturing machines.

IBM, who made the G5 processors, didn't have any incentive to create lower-powered versions of the fast chips that they had. Their main CPU business consisted of selling high-speed, high-power CPUs for use in higher-end business servers.

Either company would have been willing to develop appropriate processors if Apple had been willing to cover the entire development cost, but Apple wouldn't spend the money.


Why would they cut their ties with such a strong force?
Apple's computers were not a large portion of the semiconductor business of either company. Losing Apple as a customer did not make much of a dent in either company's bottom line.
 

gunraidan

macrumors regular
Original poster
Jul 10, 2009
176
0
Motorola, who made the G4 processors, didn't have any incentive to create faster versions of the low-power chips they had. Their main CPU business consisted of selling lower-speed, low-power CPUs for use in large embedded devices such as high-end network routers and automated manufacturing machines.

IBM, who made the G5 processors, didn't have any incentive to create lower-powered versions of the fast chips that they had. Their main CPU business consisted of selling high-speed, high-power CPUs for use in higher-end business servers.

Either company would have been willing to develop appropriate processors if Apple had been willing to cover the entire development cost, but Apple wouldn't spend the money.



Apple's computers were not a large portion of the semiconductor business of either company. Losing Apple as a customer did not make much of a dent in either company's bottom line.

Wow thanks.
 

LethalWolfe

macrumors G3
Jan 11, 2002
9,370
124
Los Angeles
I see. 3D doesn't sound that intensive, but if 4K becomes standard I can see that being a problem.
3D is intensive in that it's twice the amount of footage to process and you have to constantly be aware of how the 'left eye' and 'right eye' are interacting w/each other so that you achieve the 3D effect that you want w/o giving the audience migraines or vertigo.

I agree. I mean, it's like someone said: the CPU won't be needed that much and it will all be done through video cards. This makes it sound like CPUs will go the way of RAM. RAM used to be highly important and went through many updates (from my knowledge), but as of now? RAM rarely needs to be upgraded. Hell, even in gaming there has yet to be a game that lists 4GB of DDR2 RAM of any kind in its recommended specs, and we've just seen the first game with a quad-core processor in its recommended specs, and it isn't even necessary according to the developer (I use gaming because it has much higher baseline specs than other programs). Really, the only thing that seems to be going up is video cards.
RAM is still very, very important. The 4GB limit that 32-bit apps have is actually a barrier professionals are waiting to see lifted, and apps like AE and Compressor currently have workarounds for it. Heavy rendering in AE or Compressor can easily eat up 16GB of RAM. Also, having more RAM means you can run more programs at the same time w/o bringing the machine to a crawl. The more powerful computers become, the more powerful things people use them for.
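Just to spell out where that 4GB number comes from (a quick illustrative sketch, nothing app-specific): a 32-bit process simply can't address more than 2^32 bytes, no matter how much physical RAM is in the box.

#include <stdio.h>
#include <stdint.h>

int main(void) {
    /* Maximum address space of a 32-bit pointer: 2^32 bytes. */
    uint64_t max_32bit = 1ULL << 32;
    printf("2^32 bytes = %llu bytes = %.0f GiB\n",
           (unsigned long long)max_32bit,
           max_32bit / (1024.0 * 1024.0 * 1024.0));
    return 0;
}

That's why the 64-bit transition matters so much to people doing heavy rendering.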


Lethal
 

gunraidan

macrumors regular
Original poster
Jul 10, 2009
176
0
The more powerful computers become, the more powerful things people use them for.

I agree with this. But I think in terms of "getting things done" some things will plateau. (That multitasking thing is a good example: yes, it is very nice, but is it really a necessity?)
 

LethalWolfe

macrumors G3
Jan 11, 2002
9,370
124
Los Angeles
I agree with this. But I think in terms of "getting things done" some things will plateau. (That multitasking thing is a good example: yes, it is very nice, but is it really a necessity?)
It is if you are a professional and time is money. Heck, I remember how overjoyed I was a few years ago when I got on a Mac Pro and was able to render out an animation in AE, create an MPEG-2 file for DVD in Compressor, and keep editing in FCP w/o FCP feeling slow or laggy.


Lethal
 

gunraidan

macrumors regular
Original poster
Jul 10, 2009
176
0
It is if you are a professional and time is money. Heck, I remember how overjoyed I was a few years ago when I got on a Mac Pro and was able to render out an animation in AE, create an MPEG-2 file for DVD in Compressor, and keep editing in FCP w/o FCP feeling slow or laggy.


Lethal

That is a VERY good point. Oh well, at least RAM is getting cheaper than it previously was. I just wish Apple would make far cheaper Mac Pros, like ones for the in-between consumer and professional market. I really want to use a Mac for media creation after literally going through hell using Windows, but they are just too expensive. :(

Anyway thanks for all of your posts Lethal they have all been very helpful. :)
 

romanaz

macrumors regular
Aug 24, 2008
214
0
NJ
I realize that, but you can just buy one of Nvidia's special graphics cards (which aren't too pricey) and get a TV-with-3D-glasses bundle, which will cost less than $1,000.




I think you missed what I said. I said that for a consumer, turning their computer into a 3D machine is easy. For a professional, as someone else has also stated, it will take more power and the like. It's basically doubling the image and movie data. Doing that at HD right now will slow down all but the highest of the high-end machines.
 

gunraidan

macrumors regular
Original poster
Jul 10, 2009
176
0
I think you missed what I said. I said that for a consumer, turning their computer into a 3D machine is easy. For a professional, as someone else has also stated, it will take more power and the like. It's basically doubling the image and movie data. Doing that at HD right now will slow down all but the highest of the high-end machines.

Ahh, I see. So you're talking about people who don't shop at Newegg but at very specific places designed to supply new, advanced technology to businesses? Thanks for clearing that up; sorry for my foolish reply. :(
 

gunraidan

macrumors regular
Original poster
Jul 10, 2009
176
0
So this is what I got from this thread:

- Macs primarily have such a strong hold in the media creation market because for the longest time they were THE computers to own if you're in that field. Now, even though Macs are the same as PCs in architecture, they are still dominant because they've cemented themselves in that market. I see this as similar to how Adobe dominates so many media software markets: it doesn't matter if some of their software is FAR inferior to competitors' (e.g., Flash CS4 in 2D animation); it's what most people use and is thus the standard for that industry.

- OS X is more stable than Windows. This may sound very minor, but in hectic situations and under strict due dates, having something safe and stable is even more important than merely having power under the hood.

- While power matters a great deal in media creation, it isn't like gaming, where if you wait 2 years or so and upgrade your processor and video card you will get a significant boost; in media creation that won't do much good. The only way to get a noticeable boost in performance with the majority of software would be to go through significantly major upgrades, which is why Apple hardware used to evolve by "generations" and not little by little. So people will most likely go toward what they are comfortable with, as well as what looks more professional, which is usually a Mac.


In short, it seems that Apple is focusing on improving performance through software more so than hardware to try to get back the "superiority" of Macs. If I recall correctly, isn't Snow Leopard supposed to be mainly an attraction for developers instead of consumers, hence the "minimal" updates?
 