Man, aren't we falling behind everyone else nowadays, Apple?!
PC mobos with TB3 are showing up already and Apple is asleep on its watch? Come on.
Apple used to be a trendsetter and now only follows others?
The same thing is happening with mobile phones (Force Touch) and watches, go figure, and those seem to be Apple's main focus these days. Are we getting lazy here? Or just slow?
Bad, bad move, Apple.
 
Typically Apple is behind in most technologies. Apple is mostly interested in doing it right the first time rather than being first. What besides eGPUs is Thunderbolt 2 too slow for?
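For rough context, my back-of-envelope numbers (raw link rates, not real-world throughput):

$$ \text{TB2} = 20\ \text{Gbit/s} \approx 2.5\ \text{GB/s}, \qquad \text{PCIe 3.0} \times 16 \approx 15.75\ \text{GB/s} $$

So a TB2 eGPU gets roughly 1/6 of the bandwidth a desktop PCIe slot would give it, and TB3's 40 Gbit/s (~5 GB/s) still only closes that to about a third.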
 
But that GPU would never be out of date, since it is already perfectly paired to drive that monitor at optimal capability (resolution & framerate). In short, from a marketing point of view, that GPU would only be out of date when the display was.

Are you suggesting a laptop's GPU driving its built-in screen, or an iMac's, can never be "out of date" because it's paired and optimised?

Exactly the same situation.
 
i wouldn't say it's exactly the same situation.. an imac or laptop gpu does (can do) other things besides driving its built-in screen.. a gpu inside a display would do one thing only.. drive the screen.
 
Why would it be limited to that?

CUDA works on the nMP via a TB-connected GPU.

Or do you think that Apple will invent another monitor connector that doesn't allow GPGPU?

More Armchair Speculation. An Akitio TB enclosure is $200; wouldn't it be fun to know what you are typing about?
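If you'd rather check than speculate: a minimal sketch using the CUDA runtime API (my illustration, assuming the CUDA toolkit and the NVIDIA web driver are installed) lists every device the driver can see, and a TB-attached card shows up the same way an internal one does:

```c
// Enumerate CUDA devices; a Thunderbolt-attached GPU appears here
// like any other device - nothing in the API marks it as "external".
#include <stdio.h>
#include <cuda_runtime.h>

int main(void) {
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        printf("CUDA error: %s\n", cudaGetErrorString(err));
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("Device %d: %s (%d multiprocessors, %.1f GB)\n",
               i, prop.name, prop.multiProcessorCount,
               prop.totalGlobalMem / 1073741824.0);
    }
    return 0;
}
```

Build it with nvcc and run it with the eGPU attached and detached; the device count tells the whole story.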
 
i wouldn't say it's exactly the same situation.. an imac or laptop gpu does (can do) other things besides driving its built-in screen.. a gpu inside a display would do one thing only.. drive the screen.

That's not the issue - if we're going to see something like this, my bet would be that it simply switches off the computer's internal GPU, so ALL GPU tasks will be done by it. So you'll have a big LED panel (which should work fine for a decade) that becomes obsolete as quickly as the discrete / integrated graphics in current computers.
 
Why would it be limited to that?

CUDA works on the nMP via a TB-connected GPU.

Or do you think that Apple will invent another monitor connector that doesn't allow GPGPU?
huh? talking about a display that drives itself instead of relying on the computer hardware.. like, plug an 8k display into an ipad.

why buy a display for its egpu capabilities? seems better to keep them separate.

More Armchair Speculation. An Akitio TB enclosure is $200; wouldn't it be fun to know what you are typing about?
sigh.. mr.armchair, the spokesman of le pros, talking about armchair_ers.. :/
 
That's not the issue - if we're going to see something like this, my bet would be that it simply switches off the computer's internal GPU, so ALL GPU tasks will be done by it. So you'll have a big LED panel (which should work fine for a decade) that becomes obsolete as quickly as the discrete / integrated graphics in current computers.

You'll never get anywhere trying to reason like this. He's still waiting for Godot to bring the nMP GPU upgrades that he predicted 2+ years ago. I think the rest of us have accepted that that particular ship has sailed.

huh? talking about a display that drives itself instead of relying on the computer hardware.. like, plug an 8k display into an ipad.

why buy a display for its egpu capabilities? seems better to keep them separate.


sigh.. mr.armchair, the spokesman of le pros, talking about armchair_ers.. :/

Again, how do you connect to this? If it is TB, it is an eGPU with display at the end.

And again we have someone with 0 (ZERO, ZILCH, NADA) actual knowledge wisely speculating about eGPUs. From the comfort of his (threadbare) armchair he seeks to dictate to those who actually own an nMP and an eGPU about how they work. Brilliant. It would be like me posting on an underwater diving forum and telling people they have their nitrogen mix all wrong based on some stuff I remember from high school chemistry. Sophomoric at best.
 
why buy a display for its egpu capabilities? seems better to keep them separate.

Because you can misrepresent the use case for an eGPU as being "people wanted an external screen for lightweight laptops", rather than "people wanted the ability to upgrade graphics independent of the computer". My concern is how difficult it becomes to have an eGPU that's not an Apple screen, as a turnkey work solution, rather than a hobby project.

The world prior to the nMP wasn't clamouring for a machine with (effectively) two soldered graphics cards, yet the marketing angle was all about how they were serving customers' desire for "dual workstation graphics", when what people wanted was better support of dual standard PCI cards. Again, it's the mistaken idolisation of Henry Ford, when he complained that people would have asked for a faster horse, ignoring that his car didn't fuel itself on grass, didn't produce fertiliser for yards and farmland, couldn't reproduce itself, etc.
 
Again, how do you connect to this? If it is TB, it is an eGPU with display at the end.
i don't know how you connect it.. i'm not suggesting how to connect it.

i do know i wouldn't be interested in buying a display so i could add gpgpu power to my computer.. rather just use a box that houses external gpus..

whether or not gpu compute power is possible from a display is irrelevant.. seems like a convoluted solution for getting from point A to B.

And again we have someone with 0 (ZERO, ZILCH, NADA) actual knowledge wisley speculating about eGPUs. From the comfort of his (threadbare) armchair he seeks to dictate to those who actually own a nMP and an eGPU about how they work. Brilliant. Would be like me posting on an underwater diving forum and telling people they have their nitrogen mix all wrong based on some stuff I remember from High School chemistry. Sophmoric at best.
you're reading what you want to read because i'm not saying any of that.. if you really think that's what i'm saying then you're mistaken and we should try to get on the same page..
or you can just keep churning others' words into whatever suits your agenda.
idc
 
I give up. (Here is a hint: the "e" in "eGPU" means "external".)

As in "not inside the computer" as in "outside of the computer". A display that is not physically part of a machine is usually considered to be outside, not inside of a machine.
 
Because you can misrepresent the use case for an eGPU as being "people wanted an external screen for lightweight laptops", rather than "people wanted the ability to upgrade graphics independent of the computer". My concern is how difficult it becomes to have an eGPU that's not an Apple screen, as a turnkey work solution, rather than a hobby project.

i don't know.. when i think of egpu, i'm thinking gpgpu and not graphics.. in case that helps clarify what i'm saying.. i.e. next-level rendering farms etc.

so the idea of sending out data to be crunched in a display then sent back to the computer just sounds a bit wonky to me.

The world prior to the nMP wasn't clamouring for a machine with (effectively) two soldered graphics cards, yet the marketing angle was all about how they were serving customers' desire for "dual workstation graphics", when what people wanted was better support of dual standard PCI cards. Again, it's the mistaken idolisation of Henry Ford, when he complained that people would have asked for a faster horse, ignoring that his car didn't fuel itself on grass, didn't produce fertiliser for yards and farmland, couldn't reproduce itself, etc.

better support of dual standard PCI cards?
why?
if that's what nmp was all about, how would you be computing better right now? nothing would have changed.

there are changes happening which can be directly attributed to the nmp design.. the changes are happening in the software.
 
i don't know.. when i think of egpu, i'm thinking gpgpu and not graphics.. in case that helps clarify what i'm saying.. i.e. next-level rendering farms etc.

Right, but I'm betting the only "official" eGPU we'll see Apple support is for display, and it'll be specific displays they sell, updated as regularly as the nMP / Retina iMac is. Renderfarms exist already - it's a solved problem, unless Apple can market it as an FCPX hardware accelerator.

better support of dual standard PCI cards?
why?

So. You. Can. Change. Them. A. Year. Later. When. Better. Or. Different. Tech. Is. Released.
 
Renderfarms exist already - it's a solved problem
it's not a solved problem.. so i have a $50,000 renderfarm with 200 cpu cores or whatever.. compare that to a mid-grade gpu with a thousand cores..

renderfarms are way too expensive and inefficient to consider the problem solved.
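back of the envelope with those made-up numbers (and say $300 for the mid-grade gpu.. my guess, and yeah, cpu and gpu cores aren't equivalent, but the economics is the point):

$$ \frac{\$50{,}000}{200\ \text{cpu cores}} = \$250\ \text{per core} \qquad \text{vs} \qquad \frac{\$300}{1000\ \text{gpu cores}} = \$0.30\ \text{per core} $$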


So. You. Can. Change. Them. A. Year. Later. When. Better. Or. Different. Tech. Is. Released.
ok, so you can change them..
what i asked though was "if that's what nmp was all about, how would you be computing better right now?"
 
Like I said, if TB3 allows GPGPU appliances, fantastic. I don't believe it will, though, unless it's simply too much work to prevent. My suspicion is it'll be like running a hackintosh, if it works at all.

If the nMP had standard PCI graphics cards? I'd actually consider getting one. Not that it would necessarily be better right now.
 
Like I said, if TB3 allows GPGPU appliances, fantastic. I don't believe it will, though, unless it's simply too much work to prevent. My suspicion is it'll be like running a hackintosh, if it works at all.

we're sort of talking about different things.. or you're more talking about the technical side of things and i'm more talking like i'm making a feature request as a user.

i don't care how you make it work, just make it work! :)
and really, that's how developers etc would rather hear requests.. at least compared to "i want you to implement it by (your idea of how it should be coded/implemented/etc)".
 
What language is this in? Google translate isn't finding anything.
 
i get it that when people talk about using computers as opposed to reporting hardware benchmarks-- you get confused.

you could try to understand what i'm saying or ask questions needed for clarification.. or you could make up your own version of what i'm saying and report it back.. either one is ok with me.

or you could just ignore me but i think it's been proven that you won't.
 
He did try it on OSX and it didn't crash. You should take the time to read before replying.

Actually it did. The report was that it is better now than in previous versions of the OS. So yes, it was not working. It works incrementally better now.

The unplug event... Windows was lauded in the post, and there has been radio silence on what happens with the proposed product he wants to sell (can probably guess what the result is). And so much for the whole "virtually, seamlessly fall back onto the iGPU" concept. The low hurdle for "working" drivers is that there is no catastrophic kernel panic. Do apps on the missing GPU survive the disconnect?
 
Another armchair expert. Have you ever used an eGPU or just postulated about them?

If it takes 1,000 words, I'll know it was a "no".

It's truly amazing how many people have strongly held opinions about something they have never set eyes on. And to think that for $200-300 they could actually know what they are typing so incoherently about.

But by all means, kick back in the armchair and spout some more wisdom. Some of us are stuck doing the work.

Go check that score against the ones at Barefeats; without checking, I know it beats any and all Macs made by Apple.
 

Attachments: Screen Shot 2015-09-05 at 12.55.54 AM.png · Screen Shot 2015-09-05 at 1.00.28 AM.png
Linux, what do you mean behind? Who was supposedly behind, yet came first to market with TB? Who started using USB Type-C on their machines? Who used FireWire first? Hi-res monitors like Retina as standard?
Do I need to go on?
They're the ones who have no problem whatsoever trying new tech and discarding old legacy stuff, whether you like it or not.
The others follow suit, sometimes even launching their products in a rush just to say they're first, but only after Apple has its release schedule in place, and that's just for the sake of beating Apple to it and saying they did it first.
In the past no one really cared much about Apple, and PC manufacturers only cared about beating each other. Now Apple is in everyone's crosshairs.
They're a company to follow and even beat at all costs.
I didn't say TB2 was slow, Apple was :)

The idea of having eGPUs integrated into the monitor doesn't really cut it for me. I wouldn't need it; it's an added (unnecessary) cost, it adds to the power consumption and thermal load in an already tight case, it's another part that can break and wouldn't be easy to service, and it adds noise to a silent workstation, right in front of you...
Nah, that would be a bad idea.
If you need one, use an external box. It's easier to upgrade, you can choose the GPU inside, and you can hide it from sight if you want.
But in fact (I think dec said it a while back) the TBD is really a docking station and not a standalone monitor. I'd rather have a monitor only, and even skip the added ports; there are plenty of those on the back of the nMP.

MVC, did you even try Google Translate or, as usual, was it something that came out of your big mouth?! Do you put a sticker on the cards you sell saying "it works as an eGPU"?
 
I have always thought the goal (my wish, anyway) of an eGPU was to work the same as an internal GPU, provided we could supply the bandwidth with an appropriate connector (read: USB-C).
Is this putting it simply enough?
Unfortunately, Apple trying to make things idiot-proof has created exponentially more idiots using computers who have no business getting their hands on one.
Why make an issue of accidentally unplugging your eGPU and possibly crashing the system? You are the idiot that unplugged it, so the blame lies squarely on the end user's proficiency with their device. Same goes for a motor vehicle. We should take more responsibility for how we use our devices and learn to use them correctly.
If you don't know what an eGPU is or what you would want it for, you don't need it. I'm also pretty sure you have already found plenty of other ways to crash your system, because you don't know what you are doing as an end user.
 
you guys have great ideas but as most can see apple moves towards simplifying, not the other way around.

It is NOT gonna get more complicated. Perhaps you folks need to make an appointment with a "genius" at the apple bar so you can be kindly reminded what kind of dumb nails is going to touch your computer haha.

Hey, no offence to anyone here if you are an apple genius. I am sure there are some smart ones.
 