
When do you expect an iMac redesign?

  • 4th quarter 2019

    Votes: 34 4.1%
  • 1st quarter 2020

    Votes: 23 2.8%
  • 2nd quarter 2020

    Votes: 119 14.5%
  • 3rd quarter 2020

    Votes: 131 15.9%
  • 4th quarter 2020

    Votes: 172 20.9%
  • 2021 or later

    Votes: 343 41.7%

  • Total voters
    822
  • Poll closed.

askunk

macrumors 6502a
Oct 12, 2011
547
430
London
We may be surprised by what the XDR's cooling, modified for the iMac, could do. I mean, if they redesign the cooling system to increase airflow drastically by using the same holes as on its back (or on the front panel of the Mac Pro). :)
 

pldelisle

macrumors 68020
May 4, 2020
2,248
1,506
Montreal, Quebec, Canada
Can't wait for the iMac to be this thin!

I imagine a 5700 XT would have to be underclocked to the moon and back to fit in there ;)

With a sufficient heatsink and enough air flowing through it, I don't see why it would need such a heavy underclock. It probably won't run at the same clock as the desktop part, but that might not impact performance all that much either.
Someone has to be guinea pig so go ahead.
Meanwhile, we will all enjoy Intel whilst you wait for developers to catch up. :-D
I've always liked trying the newest things :) And as a developer, I would be really intrigued to work on this new platform.
 
Last edited:

pldelisle

macrumors 68020
May 4, 2020
2,248
1,506
Montreal, Quebec, Canada
Thank you for the link. I'll check that out. I need that rotation. It's so I can see 'A3' comic art in portrait mode. (I've just searched on Amazon.co.uk. I've bookmarked it. £544. Not bad at all.)

You won't be disappointed. Dell makes really good monitors. I've been on Dell for the last 10 years, at home just as at the office.
 

gusping

macrumors 68020
Mar 12, 2012
2,020
2,307
Thank you for the link. I'll check that out. I need that rotation. It's so I can see 'A3' comic art in portrait mode. (I've just searched on Amazon.co.uk. I've bookmarked it. £544. Not bad at all.)

Azrael.
Will the mighty Azrael share his art?

Side note - is it not better to wait for the iMac to be announced before purchasing a secondary monitor? Its form factor and/or size may influence your decision.
 

askunk

macrumors 6502a
Oct 12, 2011
547
430
London

"Right now, today, Apple could deliver high-end performance with workstation chips similar to the Altra and ThunderX3 in the same thermal and power envelopes that the iMac Pro and Mac Pro use now with the Xeon processor. The software would have to follow — which is why that transition probably won't be day and date with the lower-end models."
 

Moonjumper

macrumors 68030
Jun 20, 2009
2,747
2,935
Lincoln, UK
Long time reader, first time poster...

Q: Is it that much of a stretch to think that the new ARM-based iMac and the redesigned iMac are one and the same?

I think Apple has been preparing for this transition for years. Xcode and the compilation toolchain have improved so radically in the last few years that I bet they've had macOS running on ARM long enough to have already thoroughly tested it. They'll surely have an emulation layer for apps that have not yet been cross-compiled.

It just doesn't make sense to release a total redesign that is outdated in 6-12 months with a subsequent CPU transition.

I think we're going to get it all in one event. Hold on!!

I expect it will be next year before there is a consumer Arm Mac, as developers need time to port their apps. A new iMac is needed now.

One remote possibility is a consumer iMac with Intel, and a Developer Transition Kit that is an iMac with Arm in exactly the same body. The message being nothing will be lost in the transition. Even more remote is releasing an iMac with both Intel and Arm inside. Works now, works after the transition. But that would be expensive to do, and there isn't the space.
 

askunk

macrumors 6502a
Oct 12, 2011
547
430
London
Side note - is it not better to wait for the iMac to be announced before purchasing a secondary monitor? Its form factor and/or size may influence your decision.

I would wait. After all, if it's true, it's about a couple of weeks. ;)

I have a DELL U25H15n 25" 1440p with very thin bezels. So thin it will sit well alongside my new iMac :D
I bought it for about £150 something like 3-4 years ago.
Perhaps Dell has come out with newer ones, and surely they do bigger ones, but I agree with Azrael: they are generally worth the money.
 

gusping

macrumors 68020
Mar 12, 2012
2,020
2,307
I would wait. After all, if it's true, it's about a couple of weeks. ;)

I have a DELL U25H15n 25" 1440p with very thin bezels. So thin it will sit well alongside my new iMac :D
I bought it for about £150 something like 3-4 years ago.
Perhaps Dell has come out with newer ones, and surely they do bigger ones, but I agree with Azrael: they are generally worth the money.
If it's not announced at WWDC, I am finding a nearby bridge. And a high one at that.

Not a bad deal at all! Even better if it is an IPS panel.
I switched my 27in 4k LG IPS monitor for a 32in LG VA monitor. The shift in colour was a bit questionable at first, but I don't notice it anymore. I don't do any photo/video editing so accurate colours are not needed. Extra screen real estate please!

I run a 32in 4K monitor, with two 22in 1080p monitors stacked next to it (usually with Outlook, Spotify or a YT video playing). Multiple monitors are very handy when coding, so I can refer to all my sources and notes (started learning Swift 2 months ago...)
 

gusping

macrumors 68020
Mar 12, 2012
2,020
2,307
You should learn Swift 5! ;) It's the latest stable release. Ahaha.

I had to give my girlfriend a display for her home office. Had the MacBook Pro + 2 Dells, now MacBook Pro + 1 Dell. What a difference. It has been really hard. You get used to these kinds of things, it's crazy!
I know diddly squat, so probably don't make use of any features introduced in version 3, 4 or 5! Do you know Swift at all? I know several people have mentioned they code on here, but I cannot recall who...

Ooft, that is a nasty adjustment. I suppose I can consider myself lucky. No gf in sight (too busy refreshing MacRumours, 9to5Mac, and /r/Apple for that)
 

pldelisle

macrumors 68020
May 4, 2020
2,248
1,506
Montreal, Quebec, Canada
I know diddly squat, so probably don't make use of any features introduced in version 3, 4 or 5! Do you know Swift at all? I know several people have mentioned they code on here, but I cannot recall who...

Ooft, that is a nasty adjustment. I suppose I can consider myself lucky. No gf in sight (too busy refreshing MacRumours, 9to5Mac, and /r/Apple for that)
You should definitely switch to 5. There were a lot of breaking changes at each major version of the language, and it's been binary stable only since version 5. So there's a lot of benefit to learning v5 directly.

I coded in Swift back in v2. Nothing big. Now I'm into Python and deep learning. For the last 2-3 years I've coded strictly in Python. Love this language. I did a lot of Java too, plus C/C++, parallel APIs like CUDA, OpenMP, and MPI, a bit of Elixir, and Ruby (sh*tty language)
 
  • Wow
Reactions: gusping

gusping

macrumors 68020
Mar 12, 2012
2,020
2,307
You should definitely switch to 5. There were a lot of breaking changes at each major version of the language, and it's been binary stable only since version 5. So there's a lot of benefit to learning v5 directly.

I coded in Swift back in v2. Nothing big. Now I'm into Python and deep learning. For the last 2-3 years I've coded strictly in Python. Love this language. I did a lot of Java too, plus C/C++, parallel APIs like CUDA, OpenMP, and MPI, a bit of Elixir, and Ruby (****** lang)
I was joking, don't worry. I am using v5, haha. I even read (to some extent) the 5.2 iBook on the key/fundamentals (classes, loops, functions, protocols etc). Thought I was doing really well, then tried to follow some basic app tutorials, and realised why I never started all those years ago.

How can programming be so f**king hard, yet so many people can do it? Easily the single hardest thing I have ever had to learn. I am trying to do a little bit as often as possible, and not to give up. It'll be worth it in a few years, when I am hopefully semi-decent, haha (depending on whether I jumped off a bridge before then!).

Damn, you are a crazy fool. I wish I knew a tenth of what you do. Do you code in your day job?
I don't think it is, but it has very wide viewing angles.
That's alright then, plus £150 is a good price, especially 3-4 years ago.
 
  • Like
Reactions: askunk

askunk

macrumors 6502a
Oct 12, 2011
547
430
London
How hard is it to move from C to Swift? (sorry for the OT)
P.S. - apparently Sony is introducing the PS5 on the 11th. We might get a peek at the RDNA2 chip.
 

pldelisle

macrumors 68020
May 4, 2020
2,248
1,506
Montreal, Quebec, Canada
Once you know a language, it's easy to learn another one. Swift, along with Python, is one of the easiest languages to learn, I find. C was the first language I learned, almost 11-12 years ago; by the time I got to Swift, many years later, it was a walk in the park.
 
  • Like
Reactions: askunk

gusping

macrumors 68020
Mar 12, 2012
2,020
2,307
How hard is it to move from C to Swift? (sorry for the OT)
P.S. - apparently Sony is introducing the PS5 on the 11th. We might get a peek at the RDNA2 chip.
Exciting stuff, although it is all about the XSX for me. I need 12 teraflops! Still tempted to build a gaming PC, but I don't know if I have the patience to mess about with all the drivers, additional settings etc. Plug and play with the specs of the Series X seems very nice.
Once you know a language, it's easy to learn another one. Swift, along with Python, is one of the easiest languages to learn, I find. C was the first language I learned, almost 11-12 years ago; by the time I got to Swift, many years later, it was a walk in the park.
Alright, alright, alright. No need to bang on about it ;)

Any tips for a pleb (first time learning a programming language)? I find the fundamentals ok (for the most part), but the actual app logic/structure is a wholeeeee other ball game, not to mention the UI. Have I bitten off more than I can chew by attempting to learn Swift? Probably...
 

pldelisle

macrumors 68020
May 4, 2020
2,248
1,506
Montreal, Quebec, Canada
I need 12 teraflops

That's cute ;)

I currently have 45 Tesla V100 32 GB cards computing for me. That's 630 TFLOPS at 32-bit single precision. Or 13.5 kW. lol
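For anyone checking the math, those figures line up. A quick sketch, where the ~14 TFLOPS FP32 and 300 W per card are my assumptions based on the V100's published peak specs, not numbers from this thread:

```python
# Back-of-envelope check of the cluster figures above.
# Assumed per-card specs (approximate, from NVIDIA's published V100 numbers):
num_gpus = 45
tflops_fp32_per_gpu = 14.0   # peak single-precision throughput
watts_per_gpu = 300          # SXM2 board power

total_tflops = num_gpus * tflops_fp32_per_gpu   # 630.0
total_kw = num_gpus * watts_per_gpu / 1000      # 13.5

print(f"{total_tflops:.0f} TFLOPS, {total_kw:.1f} kW")  # → 630 TFLOPS, 13.5 kW
```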
Any tips for a pleb (first time learning a programming language)? I find the fundamentals ok (for the most part), but the actual app logic/structure is a wholeeeee other ball game, not to mention the UI. Have I bitten off more than I can chew by attempting to learn Swift? Probably...

No. It's easy to learn.

A must-read is Clean Code (https://www.amazon.ca/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882). It's the bible of any good programmer.

I personally learn a lot by example. Maybe you just need to find the kind of Swift book that fits you best.
 
  • Like
Reactions: askunk and wardie

gusping

macrumors 68020
Mar 12, 2012
2,020
2,307
That's cute ;)

I currently have 45 Tesla V100 32 GB cards computing for me. That's 630 TFLOPS at 32-bit single precision. Or 13.5 kW. lol


No. It's easy to learn.

A must-read is Clean Code (https://www.amazon.ca/Clean-Code-Handbook-Software-Craftsmanship/dp/0132350882). It's the bible of any good programmer.

I personally learn a lot by example. Maybe you just need to find the kind of Swift book that fits you best.
That's some deep deep deep learning going on. Although for gaming, I am sure an RTX 2080 Ti runs rings around those 630 TFLOPs ;) What kind of stuff do you apply deep learning to? Love hearing about examples of this kind of stuff.

If you say so! Thanks, I will take a look.
 

gusping

macrumors 68020
Mar 12, 2012
2,020
2,307
I apply deep learning to medical images. Domain adaptation, image normalization. I'm into this.

One single computation takes 5 days to complete. 5 days on a Tesla V100 32 GB to get an image.

Gaming is a waste of time.
**** me, that is a long time. Let's say you had a standard high-end off-the-shelf GPU, e.g. an RTX 2080: how long are we talking, a month?

I will let that slide, and will not prod the hornet's nest ;)
 

pldelisle

macrumors 68020
May 4, 2020
2,248
1,506
Montreal, Quebec, Canada
**** me, that is a long time. Let's say you had a standard high-end off-the-shelf GPU, e.g. an RTX 2080: how long are we talking, a month?

I will let that slide, and will not prod the hornet's nest ;)

I have two RTX 2080s in my personal server at home. To reach the same number of iterations, it's easily 3-4 times longer, so count on something like 15-20 days, yeah. With NVLink I can use them both on the same problem in a distributed manner, so it's a bit better. Otherwise, it's directly proportional to the amount of VRAM the board has.
If there were a 1 TB GPU, I would fill its RAM completely. That would unlock the possibility of creating more complex networks and learning from a lot more data.
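The estimate above boils down to simple arithmetic. The 5-day V100 baseline and the 3-4x single-card slowdown come from this thread; the ~90% two-card scaling efficiency is purely my assumption for illustration:

```python
# Rough runtime scaling for the training job discussed above.
v100_days = 5.0                     # one run on a Tesla V100 32 GB (from the thread)
slowdown_range = (3, 4)             # RTX 2080 slowdown factor (from the thread)
nvlink_efficiency = 0.9             # assumed scaling efficiency across 2 linked cards

single_2080 = [v100_days * f for f in slowdown_range]        # 15.0, 20.0 days
dual_2080 = [d / (2 * nvlink_efficiency) for d in single_2080]

print(f"1x RTX 2080: {single_2080[0]:.0f}-{single_2080[1]:.0f} days")
print(f"2x RTX 2080 (NVLink): {dual_2080[0]:.1f}-{dual_2080[1]:.1f} days")
```

So a single consumer card lands in the 15-20 day range quoted above, and even an optimistic dual-card setup stays well over a week per run.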
 