...compared to finding a place for the Thunderbolt controller physically, along with where to get the extra 4x PCIe lanes per port.
That's not an issue either.

Think about the additional controllers other board makers add to ATX-sized boards... Given Apple tends not to include as many additional controllers as other board makers' high-end offerings, fitting a TB controller and perhaps an FW800 controller shouldn't be that big of a deal. ;)

As for the PCIe lanes, the TB controller consumes lanes from either the PCIe controller that's part of the CPU die (which would reduce the number available for slots) or the additional lanes available in the chipset (X79/C600 have a total of 8x Gen 2.0 lanes). This shouldn't be an issue either, particularly if they stick with 4x slots.
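To put rough numbers on that, here's a quick back-of-envelope in Python (my figures, not from a datasheet: ~500 MB/s of usable bandwidth per Gen 2.0 lane per direction, and a 4x link for the TB controller):

    # Ballpark PCIe 2.0 budget for a TB controller hung off the chipset.
    # Assumed figures: ~500 MB/s usable per Gen 2.0 lane per direction,
    # a 4x link for the TB controller, 8 spare chipset lanes (X79/C600).
    LANE_MBPS = 500
    chipset_lanes = 8
    tb_lanes = 4

    print(f"TB controller link: ~{tb_lanes * LANE_MBPS} MB/s each way")
    print(f"Chipset lanes left for other controllers: {chipset_lanes - tb_lanes}")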
 
That's not an issue either.

Think about the additional controllers other board makers add to ATX-sized boards...

The issue is not the main board; it is the graphics card board. Putting the TB controller on it (and getting power and PCIe bandwidth) is a problem.

The core issue here is why there should be anything at all added to PCIe cards that deliver graphics. TB doesn't add anything to their core mission.

Multiple display channels? Already have them (multiple DP and/or DVI/VGA/HDMI).

DisplayPort output? Already have it.

DisplayPort v1.2 output? TB can't even do it. AMD Radeon cards... been there, done that already.

http://www.engadget.com/2011/12/09/amd-radeon-hd-6000-cards-receive-vesa-displayport-1-2-certificat

...Given Apple tends not to include as many additional controllers as other board makers' high-end offerings, fitting a TB controller and perhaps an FW800 controller shouldn't be that big of a deal. ;)

On the mainboard, sure (on the processor daughtercard it's a bit of a hassle). But the core problem is that the DisplayPort signals on a PCIe graphics card come out of the edge of the card. They don't go to the mainboard. In fact, there isn't really a good reason to send them there.

A far, far simpler solution is to put an embedded graphics solution (e.g., like on the iMac) on the mainboard and put that video stream out the Thunderbolt port. Any "add-in" PCIe video card would pump out video just like it always has. No custom, Mac Pro-only video card needed.

The other solution is to add another DP switch to the card and run a cable back to the mainboard.
 
The issue is not the main board; it is the graphics card board. Putting the TB controller on it (and getting power and PCIe bandwidth) is a problem.

Bingo. The problem is the graphics board is sitting in a PCIe x16 slot. The chipset may have the extra bandwidth for Thunderbolt, but you're still limited to PCIe x16 speeds. (PCI Express 3.0 may change this, not sure.)
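The numbers are easy enough to run as a sanity check (my arithmetic, assuming the usual ~500 MB/s per PCIe 2.0 lane per direction and Thunderbolt's advertised 10 Gb/s per channel, two channels per port; PCIe 3.0 roughly doubles the per-lane figure, which is why it could change things):

    # Share of a PCIe 2.0 x16 slot that TB traffic could consume.
    # Assumed figures: ~500 MB/s per Gen 2.0 lane per direction,
    # 10 Gb/s per TB channel, 2 channels per port.
    slot_mbps = 16 * 500          # ~8000 MB/s each way for an x16 slot
    tb_mbps = 2 * 10_000 / 8      # ~2500 MB/s with both TB channels flat out

    share = 100 * tb_mbps / slot_mbps
    print(f"x16 slot: ~{slot_mbps} MB/s; TB port: ~{tb_mbps:.0f} MB/s "
          f"(~{share:.0f}% of the slot)")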

There are ways around it, like dongles coming off ports on the main board, but that's really the question: how exactly is Apple going to do this?
 
I've been saving some cash for a new Mac Pro.

Would love to welcome the new Xeons, TB and this Radeon 7970!

Come on, Apple. Let 2012 be the year of the Mac Pro! :D (no way in H*ll)
 
I think Lion is the first OS X release I've had zero interest in...

Me too. A shame, really.

Dittoz, +1 x3 and all of that.

It's the one OS upgrade that's been total hype: a redesign but no new features. Leopard at least brought me Time Machine, Snow Leopard cleaned out all of my PPC code, and Lion... makes my Mac look like an iPad? No thanks.

I've been saving some cash for a new Mac Pro.

Would love to welcome the new Xeons, TB and this Radeon 7970!

Come on, Apple. Let 2012 be the year of the Mac Pro! :D (no way in H*ll)

Agreed. I've been saving since I sold my 2005 PowerMac G5, and between the recession, moving, and the birth of my son, I've managed to keep "most" of that cash on hand in preparation for a new video/photo rig. Even got the biggest desk I could find to hold it all.
 
The issue is not the main board; it is the graphics card board. Putting the TB controller on it (and getting power and PCIe bandwidth) is a problem.
There are two basic approaches that can be taken:
  1. Put the TB chip on the logic board or another PCIe card, and use a cable/flexible PCB connector to get the DP output from the graphics card to the TB chip. This solves any PCIe bandwidth (more of a concern in the future), power, real-estate, or thermal issues that adding a TB chip to the graphics card would create.
  2. Put the TB chip directly on the graphics card. PCB real estate, power consumption, and thermals are more likely to be an issue, particularly on higher-end cards. Then there's the increased cost, as not all users will want/need TB anyway, and some will balk at the price increase. As for PCIe bandwidth, that one isn't that big a deal ATM (it may be in the not-too-distant future), as current high-end cards can only utilize ~10 lanes' worth of bandwidth at Gen 2.0. So as things exist, there's enough PCIe bandwidth available (the GPU and TB chip <both directions> running full bore would saturate ~14 lanes).
As option #1 isn't on the graphics card, I'm looking at it as a single method with two sub-parts, rather than two distinct methods.

Between the two implementations listed above, I see locating the TB chip on a separate card or the main board (I prefer a PCIe card) as the better compromise, and the separate-card sub-part as best of all, as it gives the user the option of TB rather than an unused TB chip wasting 4 PCIe lanes that could be put to better use.
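For what it's worth, here's the arithmetic behind the ~14-lane ballpark in option #2 (the ~10-lane GPU figure is the estimate given above; treating the TB chip as a 4x Gen 2.0 device is my assumption):

    # Lane math behind option #2's "~14 lanes" ballpark.
    # Assumptions: a current high-end GPU moves ~10 Gen 2.0 lanes' worth
    # of traffic, and the TB controller takes a 4x Gen 2.0 allocation.
    gpu_lanes_used = 10
    tb_lanes = 4
    slot_lanes = 16

    total = gpu_lanes_used + tb_lanes
    print(f"GPU + TB ~= {total} lanes' worth vs {slot_lanes} physical lanes"
          f" -> ~{slot_lanes - total} lanes of headroom")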

The core issue here is why there should be anything at all added to PCIe cards that deliver graphics. TB doesn't add anything to their core mission.
I don't disagree with this at all (never have).

TB should be an option, as not everyone needs/wants it, not forced on users by including the TB chip on graphics cards whether they want it or not (it's still consumer-oriented, and ATM best suited to laptops/AIOs that don't have slots). Desktop users would most likely balk at the cost increase it creates, and I'd be one of them.

And as you've pointed out, it has limitations per the DP spec, and everything else is already available on existing cards. I totally agree with this...

TB has little, if anything, to offer the majority of desktop users ATM (i.e., it might be desired by a professional with a TB-equipped video camera, if/when those actually appear on the market).

On the mainboard, sure (on the processor daughtercard it's a bit of a hassle). But the core problem is that the DisplayPort signals on a PCIe graphics card come out of the edge of the card. They don't go to the mainboard. In fact, there isn't really a good reason to send them there.
I presumed those reading the post would realize that it meant the backplane board in the case of the MP (I used "main board" in general, as most systems still use a single board), as any connector would need to be where the PCIe slots are (seemed common sense to me). So I apologize for any confusion. :eek:
 
There are two basic approaches that can be taken:
  1. Put the TB chip on the logic board or another PCIe card, and use a cable/flexible PCB connector to get the DP output from the graphics card to the TB chip. This solves any PCIe bandwidth (more of a concern in the future), power, real-estate, or thermal issues that adding a TB chip to the graphics card would create.
  2. Put the TB chip directly on the graphics card. PCB real estate, power consumption, and thermals are more likely to be an issue, particularly on higher-end cards. Then there's the increased cost, as not all users will want/need TB anyway, and some will balk at the price increase. As for PCIe bandwidth, that one isn't that big a deal ATM (it may be in the not-too-distant future), as current high-end cards can only utilize ~10 lanes' worth of bandwidth at Gen 2.0. So as things exist, there's enough PCIe bandwidth available (the GPU and TB chip <both directions> running full bore would saturate ~14 lanes).
As option #1 isn't on the graphics card, I'm looking at it as a single method with two sub-parts, rather than two distinct methods.

Between the two implementations listed above, I see locating the TB chip on a separate card or the main board (I prefer a PCIe card) as the better compromise, and the separate-card sub-part as best of all, as it gives the user the option of TB rather than an unused TB chip wasting 4 PCIe lanes that could be put to better use.

I've got a really good feeling that Apple will put the ports on the board, and skip the idea of TB through the GPU altogether. Apple has a hard time getting decent GPUs to the masses as it stands, and from what I hear it's about firmware and drivers. Apple now needing to get drivers and firmware, as well as a TB controller, onto a 3rd-party piece of hardware is going to be murder.

It's a bit far-fetched, but I have a feeling Apple will have a new series of GPUs, but will toss the TB ports on another part of the case... like underneath the FW800 ports or something.
 
I've got a really good feeling that Apple will put the ports on the board, and skip the idea of TB through the GPU altogether. Apple has a hard time getting decent GPUs to the masses as it stands, and from what I hear it's about firmware and drivers. Apple now needing to get drivers and firmware, as well as a TB controller, onto a 3rd-party piece of hardware is going to be murder.

It's a bit far-fetched, but I have a feeling Apple will have a new series of GPUs, but will toss the TB ports on another part of the case... like underneath the FW800 ports or something.
You mean a Data-Only TB solution?

That's an easier approach (no need to worry about getting a DP signal to the TB chip), whether it's on the backplane board or via a PCIe card, but I'm not sure whether Apple would follow it. Particularly as it's not officially sanctioned by Intel yet (I expect it eventually will be, but not this early, as keeping every port full-featured fosters adoption by eliminating confusion over whether or not a TB port will carry DP data).
 
You mean a Data-Only TB solution?

That's an easier approach (no need to worry about getting a DP signal to the TB chip), whether it's on the backplane board or via a PCIe card, but I'm not sure whether Apple would follow it. Particularly as it's not officially sanctioned by Intel yet (I expect it eventually will be, but not this early, as keeping every port full-featured fosters adoption by eliminating confusion over whether or not a TB port will carry DP data).

True, never considered that confusion aspect. There would be users plugging their displays into the TB port and getting nothing.

So that pretty much sums up the need for TB to be on the card; I don't see Apple taking away another four PCIe lanes just for TB. I see so much hurt and pain coming down the line, waiting for GPUs with the proper firmware and TB controllers.
 
You mean a Data-Only TB solution?

That's an easier approach (no need to worry about getting a DP signal to the TB chip), whether it's on the backplane board or via a PCIe card, but I'm not sure whether Apple would follow it. Particularly as it's not officially sanctioned by Intel yet (I expect it eventually will be, but not this early, as keeping every port full-featured fosters adoption by eliminating confusion over whether or not a TB port will carry DP data).

I doubt we will see data-only TB, at least under the name Thunderbolt. That would add nothing except confusion and would defeat the idea of Thunderbolt being "the only cable you need".

I wouldn't be surprised to see no Thunderbolt at all. It's aimed at consumer devices, as you have said yourself. Consumer desktops all have some sort of integrated graphics nowadays, so getting the output on those won't be a problem. There are already faster solutions for workstations/servers, so TB would be fairly obsolete anyway.
 
Thunderbolt is a solution for add-in cards that the Mac Pro does not need. Well, unless they want to keep it within the product feedback loop.

You might be looking at Apple-only video cards if you want to have Thunderbolt support.
 
I doubt we will see data-only TB, at least under the name Thunderbolt. That would add nothing except confusion and would defeat the idea of Thunderbolt being "the only cable you need".

I wouldn't be surprised to see no Thunderbolt at all. It's aimed at consumer devices, as you have said yourself. Consumer desktops all have some sort of integrated graphics nowadays, so getting the output on those won't be a problem. There are already faster solutions for workstations/servers, so TB would be fairly obsolete anyway.

Apple won't see it that way though.
 
So that pretty much sums up the need for TB to be on the card; I don't see Apple taking away another four PCIe lanes just for TB.
Generally speaking, that's how I look at it.

Apple, OTOH, may not, and may decide to provide the GPU signal over TB. How they'll implement it is open to debate (last I checked, they're still selling the MDP monitor versions). Particularly if no standard is developed in time between Intel and the GPU card makers for getting a DP signal to the TB chip. Even licensing fees could be an issue in Apple's case (look at their approach to HDMI: it's only available on a couple of products).

I doubt we will see data-only TB, at least under the name Thunderbolt. That would add nothing except confusion and would defeat the idea of Thunderbolt being "the only cable you need".

I wouldn't be surprised to see no Thunderbolt at all. It's aimed at consumer devices, as you have said yourself. Consumer desktops all have some sort of integrated graphics nowadays, so getting the output on those won't be a problem. There are already faster solutions for workstations/servers, so TB would be fairly obsolete anyway.
It's possible they could use another moniker; Canon has already announced plans for Thunderbolt-equipped video cameras. So there could be confusion that way as well...

ATM, it's definitely a consumer product (aimed at laptops and AIOs without slots). There are professionals that could take advantage of it by using TB cards in their systems in order to share peripherals used with their laptops, or to import camera data (i.e., plug the video camera, or a Pegasus R4/6 box full of data recorded through a laptop on a location shoot, into the main computer).

Definitely a potential mess either way, I think, and I'm not sure Intel has figured out their exact approach yet, as they still have time to do so.

Thunderbolt is a solution for add-in cards that the Mac Pro does not need. Well, unless they want to keep it within the product feedback loop.

You might be looking at Apple-only video cards if you want to have Thunderbolt support.
There are reasons for having it in a desktop for professional use, but it's going to be a subset of users.

Apple won't see it that way though.
I suspect you may be right on this, particularly as they were part of the development (software portion, as early demos were on OS X).
 
There are reasons for having it in a desktop for professional use, but it's going to be a subset of users.
I have nothing against the Mac Pro itself. It is just a pain to try to implement Thunderbolt without a Xeon with an IGP onboard for video outputs. Switching that just requires using existing pathways. I think we will all want to see the block diagrams if Apple manages to squeeze this onto the Mac Pro.

Otherwise, some custom-PCB video cards are going to be needed.
 
I have nothing against the Mac Pro itself.
I didn't think you did. :p

It is just a pain to try to implement Thunderbolt without a Xeon with an IGP onboard for video outputs. Switching that just requires using existing pathways. I think we will all want to see the block diagrams if Apple manages to squeeze this onto the Mac Pro.

Otherwise, some custom-PCB video cards are going to be needed.
An IGP isn't an absolute necessity though. Just get DP output from a separate GPU to the TB chip, wherever it's located (separate TB card, GPU card, or the backplane board).
 
It's possible they could use another moniker; Canon has already announced plans for Thunderbolt-equipped video cameras. So there could be confusion that way as well...
Not really. Cameras are really not much different from a hard drive. They are peripherals. There is no issue with peripherals being "data only". It probably would not be DisplayPort coming over the wire; what you'd be talking to is the "SATA-like" device hooked to the PCIe bus.

In this case, TB would be supplementing and/or replacing the USB/FireWire ports that were previously on the camera. It is just data that happens to represent video/audio.

It is not likely to be a "processed" video-out data stream, but either raw data from the sensor or a compressed stream (i.e., the camera writes to a "special" file that is really just a named pipe back to the computer).

High-end tethered shooting with a camera like a RedOne/Scarlet would require something in the range of a 200 MB/s substitute for the RedMag drive.
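For scale, that rate fits comfortably inside a single TB channel (my arithmetic, taking the advertised 10 Gb/s per channel at face value and ignoring protocol overhead):

    # Headroom check: tethered RED capture over one TB channel.
    # Assumptions: 10 Gb/s per TB channel (advertised), no overhead,
    # ~200 MB/s needed to stand in for the RedMag drive.
    tb_channel_mbps = 10_000 / 8     # ~1250 MB/s
    redmag_mbps = 200

    print(f"~{tb_channel_mbps / redmag_mbps:.1f}x headroom on one channel")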

Targeting the video out on the camera (replacing the HDMI or mini-HDMI outputs) really doesn't make a lot of sense to do with TB. If you're going to hook your camera to a reference monitor, why engage in the hocus-pocus of converting to the TB flavor of DisplayPort, and then have to add a decrypt stage in the monitor? Similarly, cameras don't need (or particularly want) external drives or drive arrays (i.e., the camera is the 'top of the food chain' among peripherals).
 
Not really. Cameras are really not much different from a hard drive. They are peripherals.
I realize this.

I'm talking about familiarity/psychology in regard to marketing, as people have already seen articles that tie the Thunderbolt moniker to future video cameras. Granted, we're not talking about a full-blown marketing campaign from Canon or other camera makers yet, but the seed is already there from articles that have been published.

Seems as if they could have a battle on their hands by trying to change the interconnect's product name to reflect a data-only solution. I'm not saying it's impossible, but if it were to happen, I can imagine more than a few threads from those wondering what's going on once products reach users' hands. Which could ultimately translate to a negative effect on early sales figures.

There's the potential for a mess either way, but I'm thinking that keeping the current moniker (perhaps followed by a letter code to distinguish the variants; e.g., Thunderbolt-DO for the Data-Only variant) would be less problematic.
 
...Thunderbolt-DO...

(Beavis and Butt-Head image)

Heh...hehehe...heh...

Thunderbolt do...

He said doo...
 
I'm talking about familiarity/psychology in regard to marketing, as people have already seen articles that tie the Thunderbolt moniker to future video cameras.

I wouldn't worry about that much. This is likely only for mid- to high-end "pro" cameras, where you really don't have to worry about Bubba Joe getting easily confused.

On most cameras, where edge space is limited, USB 3.0 will likely get the nod instead of Thunderbolt. It is more ubiquitous, along with HDMI.


There's the potential for a mess either way, but I'm thinking that keeping the current moniker (perhaps followed by a letter code to distinguish the variants; e.g., Thunderbolt-DO for the Data-Only variant) would be less problematic.

For a name, Thunderbolt-D would be better. However, "micro Thunderbolt" might work better still. The "data only" variant could just drop some pins and be smaller. The physical change should make it easier for folks to comprehend that it is slightly different. Dropping channel(s) in the cable would make it similar to how there are micro-USB to "full" USB cables now.

"micro Thunderbolt" devices could be restricted to just having one connector. They go at the "end" of a TB daisy chain because they don't "do" the propagating Display Port data thing. The upcoming "Port Ridge" controller only has 1 TB channel.

http://arstechnica.com/gadgets/news...lt-controller-could-broaden-reach-of-spec.ars

Intel could have people hook that to a four-physical-channel connector. However, it would be a better match to expectations if it were not using a connector where 3 of the channels are "dead". A smaller connector that can only do one channel would be a closer match. Restrict its use to non-PCs/monitors (it has to go on peripherals) and that shouldn't bother the DisplayPort folks much.


The slippery slope would be allowing it onto add-on PCIe cards so that "boxes-with-slots" could jump in. The marketing problem is that it doesn't push forward the agenda of the standard DisplayPort connector becoming standard on PCs.


The only "Marketing message" that would need to be modified is the "Thunderbolt is the one physical port & cable to rule them all". That was always a farce. It is a even bigger farce is they continue to push that silliness after it is initially deployed and it is obvious that is not true.
 
I wouldn't worry about that much. This is likely only for mid- to high-end "pro" cameras, where you really don't have to worry about Bubba Joe getting easily confused.

On most cameras, where edge space is limited, USB 3.0 will likely get the nod instead of Thunderbolt. It is more ubiquitous, along with HDMI.
Quite possible.

Though given how quickly they announced that there would be TB-equipped cameras, I suspect they'll at least include it in their uppermost consumer model as well, along with a media blitz to generate attention and ultimately sales. Not sure how effective it will be in terms of increasing the bottom line, as professionals will go for it or not whether a media campaign exists, as you mention. But it could help with the consumer lines (not sure how well they're currently doing in this segment).

For a name, Thunderbolt-D would be better. However, "micro Thunderbolt" might work better still.
Thunderbolt-D would work better IMO.

I didn't put a lot of thought into the name as I'm horrible at it (but inserted something to illustrate the point). ;)

The "Data only" variant could just drop some pins and be smaller. The physical change should make it easier to folks to comprehend that it is slightly different. Dropping channel(s) in the cable should make it similar to how there are micro USB to "full" USB cables now.

"micro Thunderbolt" devices could be restricted to just having one connector. They go at the "end" of a TB daisy chain because they don't "do" the propagating Display Port data thing. The upcoming "Port Ridge" controller only has 1 TB channel.
This may be what they go for in the end.

In terms of costs, it's cheaper to stick with one interconnect (at least in the beginning, as adoption rates are still low). Confusion could happen either way (single interconnect used for both, or a miniaturized variant) due to users not putting in the research time to get the correct cable.

Intel could have people hook that to a four-physical-channel connector. However, it would be a better match to expectations if it were not using a connector where 3 of the channels are "dead". A smaller connector that can only do one channel would be a closer match. Restrict its use to non-PCs/monitors (it has to go on peripherals) and that shouldn't bother the DisplayPort folks much.
I definitely see the single-channel variant as a data-peripheral interconnect (single HDD/SSD, camera, ...).

As for different cabling, however, the smaller connector would only be on one end, in order for it to be usable with the full TB port (i.e., by those using TB for a monitor connection). And in such cases, it's possible users may not put in the time to learn that the smaller connector needs to be at the end of the chain, resulting in posts asking why things aren't working properly, and/or rants.

The slippery slope would be allowing it onto add-on PCIe cards so that "boxes-with-slots" could jump in. The marketing problem is that it doesn't push forward the agenda of the standard DisplayPort connector becoming standard on PCs.
This is where I see such an implementation being most important though, as such systems already have a means of connecting to the monitor, but their users are interested in using TB peripherals. Particularly if they're able to share them with laptops (i.e., pros doing field work with the laptop, such as site recording, then editing on the workstation in the office).

The only "Marketing message" that would need to be modified is the "Thunderbolt is the one physical port & cable to rule them all". That was always a farce. It is a even bigger farce is they continue to push that silliness after it is initially deployed and it is obvious that is not true.
Definitely a farce, and that's what will almost certainly come back to bite them in the you-know-what, as they have continued to push this message.
 
Though given how quickly they announced that there would be TB-equipped cameras, I suspect they'll at least include it in their uppermost consumer model as well, along with a media blitz to generate attention and ultimately sales.

I think folks like Canon said that TB was a promising technology; stuff like "excited by the possibilities". I haven't seen a high-end camera vendor say that it is a definite feature on a specific product yet. Bluntly, its usefulness is limited because it is still the case that only one major system vendor is selling TB-equipped computers (Sony's hack doesn't really count). Most of the vendors recently shipped Ultrabooks without it. We'll see if some show up at CES in a couple of weeks. If there aren't 2-3 vendors with at least an "early peek" at models scheduled for production... it is going to be a while.

At the Feb 24th introduction, Intel had an Avid spokesperson who was excited about the technology. No products yet. There are many vendors waiting on the 2nd-generation controllers. I think there will be an uptick then, but TB is never going to catch USB 3.0 in deployment.

But it could help with the consumer lines (not sure how well they're currently doing in this segment).

Consumer lines don't have a need for that kind of bandwidth. They can easily pull HD video files through USB 3.0, with the bonus that if they have to use a USB 2.0 port, it just goes slower. How many TB ports on computers are out there? It is still a dismally small percentage (talking deployed in use worldwide... not US retail sales for the last 9 months). Is there a recently designed (not a cheap retread of designs/parts from 2 years ago) Windows PC that doesn't have USB 3.0?
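To put a number on that (rough figures I'm assuming: USB 3.0's nominal 5 Gb/s signaling rate and ~50 Mb/s consumer HD footage; real-world throughput is lower, but the gap is enormous either way):

    # Why consumer HD video doesn't need TB-class bandwidth.
    # Assumed figures: 5 Gb/s nominal USB 3.0, ~50 Mb/s consumer HD footage.
    usb3_mbit = 5_000
    hd_stream_mbit = 50

    print(f"USB 3.0 moves data ~{usb3_mbit // hd_stream_mbit}x faster "
          "than the footage's native bitrate")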

Particularly if they're able to share them with laptops (i.e., pros doing field work with the laptop, such as site recording, then editing on the workstation in the office).

How many desktops come standard-equipped with PCMCIA and/or ExpressCard slots? I'm sure someone can dig a couple up, but this isn't a standard feature. It doesn't point to there being a huge, latent overlap market out there.
 
While DO Thunderbolt sounds good, the problem from Apple's end (and certainly a problem if you consider the Mac Pro to be in danger) is that Apple has to keep producing the 27" Cinema Display (the MDP version) specifically for the Mac Pro.
 
The problem from Apple's end (and certainly a problem if you consider the Mac Pro to be in danger) is that Apple has to keep producing the 27" Cinema Display (the MDP version) specifically for the Mac Pro.

Not a problem. Apple stopped making pure monitors several years ago, when they stopped selling the 30", 23", and 20" displays. The primary function of the 27" MDP Cinema Display is as a docking station. Even if the Mac Pro disappeared, its primary mission would be unaffected. The MDP version's problem is that the number of legacy MB/MBP/MBA etc. users who need a display and haven't bought one yet is going to be on the decline.

Look at any of the Mac Pro threads here where someone asks "What's the best/good/etc. monitor for a Mac Pro?" The Cinema Displays are not the most popular. Even more so if I narrow the responses to those who have bought, and are commenting on, a monitor from the last 2-3 years.

Tail wagging the dog... TB Cinema Display (i.e., docking station with a glossy LCD panel in it) sales are not going to be dependent in any significant way on Mac Pro sales, even if Apple puts a TB port on a Mac Pro. That's backwards.

Likewise, Mac Pro sales would not significantly slow if Apple didn't sell a DisplayPort monitor. It is like saying Apple sales would have significantly slowed if they stopped selling printers... didn't happen. It won't happen after folks wake up and see they've exited the pure-monitor business, either.
 
Having done a lot of testing of the LaCie T-Bolt and the Promise Pegasus T-Bolt units, I'd say the LaCie with 2 SSDs is a worthwhile piece of gear. It costs too much and the fan is loud, but it is very stable as a boot device.

I also built a FW800-to-DC cable that allows it to be powered as a portable device. It also runs without the fan if you use Samsung 256 GB Series 470 SSDs. Based on my testing, a pair of T-Bolt ports on a new Mac Pro would be a decent addition.

The LaCie device needs a few mods and a lower price to be really good, but it has great promise as a really fast boot device that can be moved from machine to machine with ease.

Not to knock the Promise Pegasus, as it is a better device than the LaCie in many ways.

For Mac Pro users it is not needed; Sans Digital has plenty of lower-cost RAID solutions, along with a few other 8-bay RAID providers.
 
