
JouniS

macrumors 6502a
Nov 22, 2020
638
399
The way I see it, it's because Linux is free and most scientific computing tasks are done by academics, where budgets are tight. I'm not so sure about commercial scientific projects, though. Those probably value support and uptime over the ease of tinkering provided by commodity equipment.
Linux is the default operating system in many fields, both in academia and in industry. It's free, which means you can make it do what you want. You are not at the mercy of the bureaucrats at Microsoft, who keep introducing new features to justify their continued employment, breaking random things in the process.
 
  • Like
Reactions: Oculus Mentis

MRMSFC

macrumors 6502
Jul 6, 2023
371
381
It's unclear why Apple should be the one to invest in every side project that strikes the internet's fancy.
MoltenVK is coming along fine. hipSYCL is coming along fine.
It's up to Apple to decide which 3rd party projects of this type to support (WebGL? Blender? PyTorch?) Unless you have data to show why the cases they have supported vs the cases they have not were badly chosen, I don't think there's much useful to say.

Otherwise where does it end? Should Apple be supporting TeX? Julia? R? FFTW? Everyone can come up with their own pet project...
I think there’s projects Apple should support to keep their core demographics, (i.e. academic, education, and creative) and people want that.

Take their recent attempts with Blender for example. 3D rendering is one “creative” area which could use a lot of work. So they contribute to it in hopes that it will reflect positively on their hardware to 3D modelers.

There’s also a lot of emerging technology that we have no clue what will happen. So, people naturally are throwing **** at the wall and seeing what sticks.
 

MRMSFC

macrumors 6502
Jul 6, 2023
371
381
If Apple adopted more open standards, it wouldn't need to fund so many projects. For example, Apple has invested in the Blender viewport because macOS does not support OpenGL 4.3+ or Vulkan. Neither AMD, Intel, nor Nvidia have had to do so, because they support those standard APIs.
Well if it would save them money and allow them to run more programs natively, then why didn’t they go that route?

There’s likely a factor that’s not being taken into account here. Clearly the decision makers have some reason to roll their own, rather than going with the obvious solution.

As much as I like open standards in theory, sometimes they aren’t the best solution for everyone. And without all factors taken into account, we can’t really say whether it’s the best course of action.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
Linux is the default operating system in many fields, both in academia and in industry. It's free, which means you can make it do what you want. You are not at the mercy of the bureaucrats at Microsoft, who keep introducing new features to justify their continued employment, breaking random things in the process.
Sure, but relying on Linux for mission-critical applications may sometimes not be wise.

That‘s why we have so many different solutions catering for different problems. Use the best option for what you need.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
Clearly the decision makers have some reason to roll their own, rather than going with the obvious solution.
I can understand why companies like Apple, Nvidia or Microsoft want to impose their standards. Proprietary standards give companies a competitive advantage, and in the long run it pays off. So most companies try to impose them, but only a few succeed.

We also have examples of closed standards that are well managed and successful.
Could you give a couple of examples? Only licensable standards like Arm ISA or FRAND patents come to mind, and I am no longer sure about Arm ISA.
 

Numa_Numa_eh

Suspended
Jun 1, 2023
87
105
I can understand why companies like Apple, Nvidia or Microsoft want to impose their standards. Proprietary standards give companies a competitive advantage, and in the long run it pays off. So most companies try to impose them, but only a few succeed.


Could you give a couple of examples? Only licensable standards like Arm ISA or FRAND patents come to mind, and I am no longer sure about Arm ISA.
DirectX, CUDA.
 
  • Like
Reactions: Xiao_Xi

name99

macrumors 68020
Jun 21, 2004
2,407
2,308
Well if it would save them money and allow them to run more programs natively, then why didn’t they go that route?

There’s likely a factor that’s not being taken into account here. Clearly the decision makers have some reason to roll their own, rather than going with the obvious solution.

As much as I like open standards in theory, sometimes they aren’t the best solution for everyone. And without all factors taken into account, we can’t really say whether it’s the best course of action.
You have to remember that the assertions about FOSS and similar issues ("open" standards and so on) are mostly made by young people with little serious technical knowledge AND little serious historical experience. They are statements of affinity, not of technical evaluation.
You can easily see this when you look at the details. Demands for "standards"... except AV1 (corporate controlled) is considered a standard whereas H.266 (an international standard) doesn't make the cut. Similarly with regard to CUDA or x86: apparently these are not corporate controlled in the eyes of some people! Or web standards. Or language standards. Wild claims about how Linux is on the verge of some sort of breakout (soon to be followed by RISC-V). Etc etc.
NONE of this is based on any sort of fact-based analysis of what's best for Apple, what's best for the world, how the economics work out, etc etc. There are adults writing such analyses – but they tend to be found in the (non-popular) financial press, not ranting on the internet.

You can't use rationality to argue someone out of a position they fell into non-rationally. All you can do is ignore them and hope (true maybe 10% of the time) that two or three decades of growing up will give them a clearer sense of reality.
 
  • Like
Reactions: thenewperson

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
analysis of what's best for Apple, what's best for the world, how the economics work out, etc etc.
It is in Apple's best interest to develop proprietary standards when it can gain a competitive advantage and to adopt open standards when it can't.

The best thing for the world is standards; whether they are closed or open, free or paid doesn't matter. It sucks to go to another country and find that people drive on the other side of the road, speak another language, or use a different type of plug or system of measurement.
 

leman

macrumors Core
Oct 14, 2008
19,516
19,664
I can understand why companies like Apple, Nvidia or Microsoft want to impose their standards. Proprietary standards give companies a competitive advantage, and in the long run it pays off. So most companies try to impose them, but only a few succeed.

Unfortunately, it’s not that simple with open standards either. These are bs led by interested groups and individuals which try to push them in a certain direction. You either end up with a compromise that sucks for everyone or a system that benefits a particular group.

Could you give a couple of examples? Only licensable standards like Arm ISA or FRAND patents come to mind, and I am no longer sure about Arm ISA.

CUDA, DirectX, Cocoa, Thunderbolt, (LP)DDR, ARM…
 

dgdosen

macrumors 68030
Dec 13, 2003
2,817
1,463
Seattle
You have to remember that the assertions about FOSS and similar issues ("open" standards and so on) are mostly made by young people with little serious technical knowledge AND little serious historical experience. They are statements of affinity, not of technical evaluation.
You can easily see this when you look at the details. Demands for "standards"... except AV1 (corporate controlled) is considered a standard whereas H.266 (an international standard) doesn't make the cut. Similarly with regard to CUDA or x86: apparently these are not corporate controlled in the eyes of some people! Or web standards. Or language standards. Wild claims about how Linux is on the verge of some sort of breakout (soon to be followed by RISC-V). Etc etc.
NONE of this is based on any sort of fact-based analysis of what's best for Apple, what's best for the world, how the economics work out, etc etc. There are adults writing such analyses – but they tend to be found in the (non-popular) financial press, not ranting on the internet.

You can't use rationality to argue someone out of a position they fell into non-rationally. All you can do is ignore them and hope (true maybe 10% of the time) that two or three decades of growing up will give them a clearer sense of reality.
I'd be curious to hear your thoughts on Linux in the server space.
 

MRMSFC

macrumors 6502
Jul 6, 2023
371
381
I can understand why companies like Apple, Nvidia or Microsoft want to impose their standards. Proprietary standards give companies a competitive advantage, and in the long run it pays off. So most companies try to impose them, but only a few succeed.
That’s not what I meant.

A company or individual would have reasons to go with their own proprietary standard rather than an open one if their needs aren't served by that standard or the committee that makes it. (Beyond just having a monopoly over their standard, that is.)

You have to remember that the assertions about FOSS and similar issues ("open" standards and so on) are mostly made by young people with little serious technical knowledge AND little serious historical experience. They are statements of affinity, not of technical evaluation.
You can easily see this when you look at the details. Demands for "standards"... except AV1 (corporate controlled) is considered a standard whereas H.266 (an international standard) doesn't make the cut. Similarly with regard to CUDA or x86: apparently these are not corporate controlled in the eyes of some people! Or web standards. Or language standards. Wild claims about how Linux is on the verge of some sort of breakout (soon to be followed by RISC-V). Etc etc.
NONE of this is based on any sort of fact-based analysis of what's best for Apple, what's best for the world, how the economics work out, etc etc. There are adults writing such analyses – but they tend to be found in the (non-popular) financial press, not ranting on the internet.

You can't use rationality to argue someone out of a position they fell into non-rationally. All you can do is ignore them and hope (true maybe 10% of the time) that two or three decades of growing up will give them a clearer sense of reality.
While I agree that many people ignore the unknown unknowns in Internet arguments, I don’t think it’s particularly relevant.
 

MRMSFC

macrumors 6502
Jul 6, 2023
371
381
Sure, but relying on Linux for mission-critical applications may sometimes not be wise.
The nice thing about Linux is that you can strip out the garbage you don't want (i.e. stuff that could break).

Ubuntu is used at the lab I work at, and frankly it's less of a pain than Windows.
That‘s why we have so many different solutions catering for different problems. Use the best option for what you need.
This, I believe, is the real answer.

The world is messy, and you can’t simplify problems to a degree that one giant standard will suffice.
 
  • Like
Reactions: quarkysg

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
A company or individual would have reasons to go with their own proprietary standard rather than an open one if their needs aren't served by that standard or the committee that makes it.
A company doesn't develop a standard and make it proprietary because it can't find one that suits it, but because it gives it a competitive advantage. A company can develop a new standard and make it available to others, as AMD did with Mantle (later Vulkan) or Apple did with OpenCL.

These are led by interested groups and individuals who try to push them in a certain direction. You either end up with a compromise that sucks for everyone or a system that benefits a particular group.
I can understand that developing a standard when some of its members have a hidden agenda must be a nightmare, but I want to believe that most groups that develop standards don't have those problems.

relying on Linux for mission-critical applications may sometimes not be wise.
You would use Linux if you had mission-critical applications because commercial Linux distributions provide support for 10 years. What other operating system can give you that?
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,516
19,664
I can understand that developing a standard when some of its members have a hidden agenda must be a nightmare, but I want to believe that most groups that develop standards don't have those problems.

It doesn’t even need to be a hidden agenda. Just pragmatic interests in a committee. I have two anecdotes that might be relevant.

1) Nvidia was probably the company that opposed OpenGL modernization the most. Why? Because they had the resources to hand-optimize drivers for individual apps and maintained a wide range of popular proprietary OpenGL extensions. For them, making something new would not have been worth the hassle and would have diminished their performance advantage, while other companies wanted to change things simply because maintaining high-performance OpenGL drivers is a nightmare.

2) Rust development was financed by Mozilla, which meant that the features Mozilla cared about most were prioritized. They rushed the async implementation without stabilizing or properly developing the underlying language concepts, while important aspects of the core language remained neglected. Meanwhile, smaller, more focused projects led by passionate individuals or groups (like Zig) were able to develop a more streamlined and consistent vision.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
It is in Apple's best interest to develop proprietary standards when it can gain a competitive advantage and to adopt open standards when it can't.
Not exactly true. Many times, it is because existing standards cannot fulfill what Apple needs at the time. By the time a suitable standard emerges, Apple is already well on its way.

I think you have to realise that a for-profit organisation is in it to make a profit, not to make the world a better place. It's just that I think what Apple is doing is, overall, better than the rest of its competition.
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
You would use Linux if you had mission-critical applications because commercial Linux distributions provide support for 10 years. What other operating system can give you that?
Again, I would say it depends.

And I'm saying this as someone who likes to tinker with the Linux kernel, building WiFi router firmware with my own custom enhancements to the kernel drivers. Couple the ever-morphing kernel with the vast array of Linux distributions, and it is very hard for any company to provide good support. And you cannot get commercial support in some parts of the world.

Linux allows infinite tinkering, but deploying it at scale requires lights-out operation that does not depend on an army of top-notch engineers. Many data centers are staffed by folks who only know how to run canned scripts.

There is a reason why Linux has not taken over the desktop in 2023.
 
  • Like
Reactions: Kazgarth

altaic

macrumors 6502a
Jan 26, 2004
711
484
Again, I would say it depends.

And I'm saying this as someone who likes to tinker with the Linux kernel, building WiFi router firmware with my own custom enhancements to the kernel drivers. Couple the ever-morphing kernel with the vast array of Linux distributions, and it is very hard for any company to provide good support. And you cannot get commercial support in some parts of the world.

Linux allows infinite tinkering, but deploying it at scale requires lights-out operation that does not depend on an army of top-notch engineers. Many data centers are staffed by folks who only know how to run canned scripts.

There is a reason why Linux has not taken over the desktop in 2023.
I’m familiar with mission critical embedded and server software (and hardware), but mission critical desktop apps? Are you talking Kerbal Space Program?
 

quarkysg

macrumors 65816
Oct 12, 2019
1,247
841
I’m familiar with mission critical embedded and server software (and hardware), but mission critical desktop apps? Are you talking Kerbal Space Program?
Apologies if my replies imply I'm referring to mission-critical "desktop apps", if there is such a thing. Well, anything can be mission critical if you cannot afford downtime.

I'm alluding to the fact that Linux on the desktop is not a thing because, logically, if it is so good and free, it should have taken over the world.

Again, everything should have context. I don't think there is any one tech that is best for everything. Use the best tool you can find for the problem you need to solve. Arguing in absolutes will only cause everyone to go in circles.
 
  • Like
Reactions: Xiao_Xi

name99

macrumors 68020
Jun 21, 2004
2,407
2,308
I'd be curious to hear your thoughts on Linux in the server space.
Linux is probably best understood in terms of Clayton Christensen's _The Innovator's Dilemma_.

Remember that Christensen pointed out that some technologies (usually, but not exclusively, when they are new and leading edge) are best handled completely within a single company, because to reach the goals (performance, weight, battery life, whatever) every part of the device has to work optimally with every other part.
But other technologies have been around long enough, and have their use cases well enough understood, that they can be standardized, which allows for a different set of advantages (eg competition and volume leading to lower, often much lower, prices).

Linux operates as such a standardized commodity IN SPACES THAT MEET THESE CRITERIA. This is primarily data warehouses, where the task to be performed is fairly standardized, as is the hardware. As you move further from this, it becomes less and less desirable. In mobile specifically, it requires Google to do the best it can (which is not always that great) to try to get it to fit a constantly changing CPU architecture (big.LITTLE, now three tiers), constantly changing GPU use cases, the new appearance of NPUs, and so on.

Even in the server space, certain technologies grow flabby and ever less fit for purpose as time goes on, if there's no strong constituency for putting up with some pain in order to make things better. The current hypervisor+Linux model is hardly a great use of resources, and ever more serious data center work (eg Nitro, or interaction with GPUs and NPUs) is diverging more and more from "standard" Linux. Security is likewise hardly ideal (it may be better than Windows in this space, but it's not like that's a great endorsement).

As far as I can tell all the large datacenter vendors are running something that's diverging more and more from "public" Linux every year. I'm not sure how it ends, but *for now* everyone involved seems to find it a convenient fiction to pretend that they're all running Linux, even as more and more compute moves onto devices that aren't (and possibly can't, like GPUs) run Linux.
 

name99

macrumors 68020
Jun 21, 2004
2,407
2,308
I can understand that developing a standard when some of its members have a hidden agenda must be a nightmare, but I want to believe that most groups that develop standards don't have those problems.
You want to believe this even after plenty of evidence to the contrary, and repeated explanations of why it doesn't work that way from people inside the system? OK...
I can give you my personal experience which is with codecs. Why do you think the "standard" codec naming changed from MPEG1, 2, 4 to h.264, 265, ...? Because the process within MPEG (ie within ISO/IEC) became so toxic and mired down in politics that the only way for the engineers to move forward was basically to say "screw it" and move the work to a different organization, the ITU.
(This is not the official version of what happened. It IS the version of what happened that you will hear from insiders who were actually part of the system...)

Of course standards groups can work well when there are no high stakes, or if almost everyone involved is primarily an engineer not a manager/politician. But those are not the high profile cases.

You would use Linux if you had mission-critical applications because commercial Linux distributions provide support for 10 years. What other operating system can give you that?
Uh, Windows promises support for ten years with LTSC releases.
IBM will give you all that and a lot more; hell IBM will support you for fifty years.

Ultimately, however, I think this is one of these things where what companies think they want is NOT what is in their best interests. Pretending that software is like purchasing a machine that will operate in the same way for fifty, or even ten, years is not realistic. Software should be considered an on-going expense, like electricity and trash pickup. Software has to interact with a world that is constantly changing (underlying hardware, legal requirements, internet specifications, even user expectations) and has to change to match. It's brittle, and bad company policy, to pretend otherwise.
 
  • Like
Reactions: leman

name99

macrumors 68020
Jun 21, 2004
2,407
2,308
It is in Apple's best interest to develop proprietary standards when it can gain a competitive advantage and to adopt open standards when it can't.
That is *mostly* the wrong dimension to characterize the problem. Believing that the world is run by a small fraction who do things only for corporate advantage is exactly the same as other conspiracy theories, whether anti-semitism or QAnon; it's desperate "thinking" by people who want to imagine that the world is much simpler than it really is.

The correct dimension, in most cases, is what I said two posts above: mature vs immature technologies. Mature technologies can be standardized and commoditized, but this doesn't work for immature technologies. And most of tech is still immature!
Even when you think something is mature, a new use case comes along for which the standard solution is sub-optimal. Linux scheduling was not appropriate for heterogeneous designs (ie big.LITTLE).
It took years to get something that is kinda sorta appropriate, and I strongly doubt (and experience seems to confirm) that it's anything close to as good as what Apple has when you take into account ALL the relevant variables (which threads should be grouped together on a cluster? when should a thread move from a P to an E core, or vice versa? if a core is reporting a lot of stalls because it keeps missing to DRAM, is the right response to speed up the DRAM or to slow down the core?).
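Purely as an illustration of the kind of trade-offs involved, here is a toy placement heuristic in Rust. The thresholds and inputs are invented for the sake of the sketch; this is not how the Linux scheduler or Apple's machinery actually works.

Code:
// Toy placement heuristic for a heterogeneous (P/E-core) system.
// All thresholds and inputs are invented purely for illustration.

#[derive(Debug, Clone, Copy)]
enum Core {
    Performance,
    Efficiency,
}

struct ThreadStats {
    runnable_ratio: f64,   // fraction of time the thread wants the CPU
    dram_stall_ratio: f64, // fraction of cycles stalled waiting on memory
    latency_sensitive: bool,
}

fn place(stats: &ThreadStats) -> Core {
    // Latency-sensitive work (e.g. UI) goes to a P core regardless of load.
    if stats.latency_sensitive {
        return Core::Performance;
    }
    // A thread mostly stalled on DRAM gains little from a faster core,
    // so parking it on an E core may be the better energy trade-off.
    if stats.dram_stall_ratio > 0.6 {
        return Core::Efficiency;
    }
    // Busy, compute-bound threads earn a P core; background work stays on E cores.
    if stats.runnable_ratio > 0.7 {
        Core::Performance
    } else {
        Core::Efficiency
    }
}

fn main() {
    let ui = ThreadStats { runnable_ratio: 0.3, dram_stall_ratio: 0.1, latency_sensitive: true };
    let batch = ThreadStats { runnable_ratio: 0.9, dram_stall_ratio: 0.7, latency_sensitive: false };
    println!("ui -> {:?}, batch -> {:?}", place(&ui), place(&batch));
}

Even this toy version hides the hard parts: the inputs have to be measured cheaply, the thresholds interact with frequency scaling, and the answers change when threads share data across a cluster.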

Another example would be getting a CPU and GPU to collaborate on a task. Apple can do this well in part because they have a NoC and SLC that transport and honor a variety of flags associated with each cache line that allow the SLC to prioritize various clients and operate the SLC both as a GPU-optimized cache and a communications point between the CPU and GPU. You'd think both GPUs and CPUs are mature technologies, but not when it comes to working together on a common problem...

And standards suffer from the same bikeshedding as internet forums. When Apple wants to improve things, the OS team (a small subset) talks to the CPU team (a small subset) and everyone is motivated to get things done. When Linux wants to improve scheduling, you have five people who want to solve the problem along with five hundred people who want to say that ARM sucks, that Alder Lake sucks, that whatever they understood about OS scheduling in the 1990s was the one true solution and anything later is airy-fairy nonsense, and so on and so on -- five people who want to solve the problem and five hundred who just want to rant and virtue signal.
 

dgdosen

macrumors 68030
Dec 13, 2003
2,817
1,463
Seattle
Linux is probably best understood in terms of Clayton Christensen's _The Innovator's Dilemma_.

Remember that Christensen pointed out that some technologies (usually, but not exclusively, when they are new and leading edge) are best handled completely within a single company, because to reach the goals (performance, weight, battery life, whatever) every part of the device has to work optimally with every other part.
But other technologies have been around long enough, and have their use cases well enough understood, that they can be standardized, which allows for a different set of advantages (eg competition and volume leading to lower, often much lower, prices).

Linux operates as such a standardized commodity IN SPACES THAT MEET THESE CRITERIA. This is primarily data warehouses, where the task to be performed is fairly standardized, as is the hardware. As you move further from this, it becomes less and less desirable. In mobile specifically, it requires Google to do the best it can (which is not always that great) to try to get it to fit a constantly changing CPU architecture (big.LITTLE, now three tiers), constantly changing GPU use cases, the new appearance of NPUs, and so on.

Even in the server space, certain technologies grow flabby and ever less fit for purpose as time goes on, if there's no strong constituency for putting up with some pain in order to make things better. The current hypervisor+Linux model is hardly a great use of resources, and ever more serious data center work (eg Nitro, or interaction with GPUs and NPUs) is diverging more and more from "standard" Linux. Security is likewise hardly ideal (it may be better than Windows in this space, but it's not like that's a great endorsement).

As far as I can tell all the large datacenter vendors are running something that's diverging more and more from "public" Linux every year. I'm not sure how it ends, but *for now* everyone involved seems to find it a convenient fiction to pretend that they're all running Linux, even as more and more compute moves onto devices that aren't (and possibly can't, like GPUs) run Linux.
I'd say Linux (on the server) has stood the test of time - and it will continue to evolve. It won't meet every server need, but there's a solid enough base that it won't be going away anytime soon.

I find your comment above very agreeable - and it makes me think of "evolution" in Wardley Maps (https://learnwardleymapping.com - highly recommended). There's a common concept to these visualized strategies, that over time, innovative ideas become more commoditized and abstracted away in a value chain (S Curves, if you will). Food for thought.
 

Xiao_Xi

macrumors 68000
Oct 27, 2021
1,627
1,101
They rushed the async implementation without stabilizing or properly developing the underlying language concepts, while important aspects of the core language remained neglected.
The async/await syntax was included in 1.39. Which C feature had not been included by then?

Many times, it is because existing standards cannot fulfill what Apple needs at the time. By the time a suitable standard emerges, Apple is already well on its way.
In addition to Metal, Apple could also support SYCL and Vulkan.

Why do you think the "standard" codec naming changed from MPEG1, 2, 4 to h.264, 265, ...? Because the process within MPEG (ie within ISO/IEC) became so toxic and mired down in politics that the only way for the engineers to move forward was basically to say "screw it" and move the work to a different organization, the ITU.
The correct question is why codec naming changed from h.xxx to MPEG.
The first digital video coding standard was H.120, created by the CCITT (now ITU-T) in 1984.
The predecessor of MPEG-1 for video coding was the H.261 standard produced by the CCITT (now known as the ITU-T).
H.264 was standardized by the ITU-T Video Coding Experts Group (VCEG) of Study Group 16 together with the ISO/IEC JTC 1 Moving Picture Experts Group (MPEG).

IBM will give you all that and a lot more; hell IBM will support you for fifty years.
Do you have a link to that?

Mature technologies can be standardized and commoditized, but this doesn't work for immature technologies.
I doubt that proprietary standards can be merged into an open standard when that technology matures. So it is important to start using open standards from the beginning. Is the metric system mature enough to be accepted by all countries?
 
Last edited:

leman

macrumors Core
Oct 14, 2008
19,516
19,664
Not exactly true. Many times, it is because existing standards cannot fulfill what Apple needs at the time. By the time a suitable standard emerges, Apple is already well on its way.

Or the standard ends up different from what was expected. Take the GPU APIs as an example. After a bunch of unsuccessful OpenGL attempts and the inability of the committee (the ARB, and later Khronos) to modernize the API, Apple went ahead and put together their own easy-to-use flavor of DirectX for mobile (the first Metal). When Vulkan was announced, Apple was listed as one of the supporters. Then Vulkan was released, and it was a hugely complicated mess aimed at compatibility with exotic hardware and the convenience of driver developers instead of application developers. I can imagine Apple GPU executives looking at that and saying "screw it, we can do better". And they did.

The async/await syntax was included in 1.39. Which C feature had not been included by then?

What does it have to do with C features? A bit confused by your wording here…

One particular example I have in mind is const generics. The team was rushed into shipping async by the pressure of the corporate backer, while at the same time the core language had to hardcode array methods for each array length, leaving them unavailable for arrays longer than 32 elements. Sure, one can look at it through the prism of pragmatism and say that it works well enough, which is possibly true. But it also betrays a certain kind of sloppiness that ends up diminishing the overall product quality.
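To make the const generics point concrete, here is a minimal sketch (the trait and its name are made up for illustration, not taken from the standard library). Before const generics were stabilized for users in Rust 1.51, standard-library traits for fixed-size arrays were macro-expanded once per length and stopped at 32 elements; with const generics, a single impl covers every length:

Code:
// A trait we would like to implement for fixed-size arrays of *any* length.
// (Illustrative only; not a standard-library trait.)
trait SumAll {
    fn sum_all(&self) -> i64;
}

// With const generics, one impl covers every array length N.
// Before they landed, impls like this had to be macro-expanded per length,
// which is why many array trait impls used to stop at 32 elements.
impl<const N: usize> SumAll for [i64; N] {
    fn sum_all(&self) -> i64 {
        self.iter().sum()
    }
}

fn main() {
    let small = [1i64, 2, 3];
    let large = [1i64; 64]; // longer than 32: fine once const generics exist
    println!("{} {}", small.sum_all(), large.sum_all());
}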


I doubt that proprietary standards can be merged into an open standard when that technology matures.

Why would this be the ultimate goal? Why not let the technology develop at its own pace?
 

leman

macrumors Core
Oct 14, 2008
19,516
19,664
Ultimately, however, I think this is one of these things where what companies think they want is NOT what is in their best interests. Pretending that software is like purchasing a machine that will operate in the same way for fifty, or even ten, years is not realistic. Software should be considered an on-going expense, like electricity and trash pickup. Software has to interact with a world that is constantly changing (underlying hardware, legal requirements, internet specifications, even user expectations) and has to change to match. It's brittle, and bad company policy, to pretend otherwise.

Thank you! I always though it was utterly crazy that folks are ok with bringing their car for inspection every year, and yet software is expected to run forever without maintenance, and that in a dynamic environment with fast changing circumstances and threats! This is exactly what leads to mediocrity.
 
  • Like
Reactions: Pet3rK