
surferfb

macrumors 6502a
Nov 7, 2007
756
2,004
Washington DC
IF you count the AVP as a "computer". I don't, in the same way that I don't count the iPad as a "computer", on the basis that neither of them can close the loop on creating software from scratch for themselves.

This is off topic, but I’ve always wondered this and figured I might as well ask - if Apple were to announce and release full Xcode for iPad at WWDC this year, do iPads retroactively become computers?
 

mattspace

macrumors 68040
Jun 5, 2013
3,329
2,965
Australia
This is off topic, but I’ve always wondered this and figured I might as well ask - if Apple were to announce and release full Xcode for iPad at WWDC this year, do iPads retroactively become computers?

Granted it is off-topic, but yeah, if you could write compiled software from scratch on the iPad, and then install and run that software locally without having to send it off to Apple for approval, then yes, I'd change my classification of it. If you could use an iPad to write and compile iPadOS, that would seem a pretty convincing argument to me.
 
  • Like
Reactions: surferfb

NT1440

macrumors Pentium
May 18, 2008
15,088
22,154
I’ll admit I’m curious. Mostly because I’ve seen this sentiment from a few posters here at MR, and *only* here. The definition has arisen in the last year or two that a computer is only a “real” computer if you can code on it.

Technological history is an interesting topic for me, and in all my reading I’ve never seen this definition come up anywhere but here at MR. Did I miss something?
 

G5isAlive

Contributor
Aug 28, 2003
2,832
4,875
You say that at the end of the subscription, "all the functionality that is inherent to the hardware" will continue to work. But what does "inherent to the hardware" mean? For example, in some cases you can get 3 monitors and in some cases you can get 5 monitors. That's inherent in the software, not in the hardware.

Good point. What hasn't gotten much discussion here is that the Visor is a client, just like the other headsets connecting to Immersed software, the Quest 3 or now (with beta software) the AVP. On the Quest 3 or AVP you run an app on the headset that lets you connect to your server device. We can presume that software is built into the Visor, but it is still a client. Where it gets interesting is the server app you run on your tethered device, which runs the apps that show up on the screens. That's an Immersed app that currently requires a subscription. The subscription is free for 3 screens (and low res) and 5 bucks for the pro version that gives you 5 screens. Now maybe they will waive that for all Visors, maybe not; the point remains that for the Visor to have full functionality, it requires both server and client software. And that is NOT inherent to the hardware. Don't update the software, and the hardware eventually fails. lol.

Sure, the same might be said about the AVP, except the AVP does run its own apps and doesn't require tethering, and I dare say the likelihood of Apple continuing support is higher than that of a small startup software company now trying to do hardware.

Who knows. While we know the Visor doesn't have any direct video connections like HDMI, maybe built into the Visor client software is the ability to mirror one screen over a direct wired connection (not available for the AVP except via the developer strap). We don't know; it's not stated explicitly on their website or in their FAQ.

The only other thing inherent to the hardware is the ability to run (untethered) a 'browser.' Maybe that's enough for some.
 

mattspace

macrumors 68040
Jun 5, 2013
3,329
2,965
Australia
Technological history is an interesting topic for me, and in all my reading I’ve never seen this definition come up anywhere but here at MR. Did I miss something?

Mainly because the phenomenon of computer-like devices which can't be used to write their own software is a relatively new thing. In the past, computers would come with the programming tools to make new software by default, or the programming tools were a commercial product you had to buy. Either way, you could write software to run on your computer, on your computer. That was more or less universal, until the iPad.

The test isn't so much whether you can write code (any text editor can do that) but whether you can then compile that code into a native application that works like any other native application. The idea is that it's an independent general-purpose machine that can be bootstrapped with new functionality made on the machine itself.
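As a concrete sketch of what closing the loop means (the file name is hypothetical, and this assumes nothing beyond a standard C toolchain that provides a cc command):

/* hello.c - written, compiled, and run on the same machine.
   On a Mac the whole loop happens locally:
       cc hello.c -o hello && ./hello
   On an iPad there is no supported way to perform that second step. */
#include <stdio.h>

int main(void) {
    printf("Built and run on the machine that wrote me.\n");
    return 0;
}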
 

Analog Kid

macrumors G3
Mar 4, 2003
9,351
12,579
I’ll admit I’m curious. Mostly because I’ve seen this sentiment from a few posters here at MR, and *only* here. The definition has arisen in the last year or two that a computer is only a “real” computer if you can code on it.

Technological history is an interesting topic for me, and in all my reading I’ve never seen this definition come up anywhere but here at MR. Did I miss something?

No.

If I had time, and MR had better search capability, I'd like to compile a list of all the people saying the iPad isn't a computer because you can't run Xcode locally, and cross-reference it with a list of all the people who demanded Apple allow the iPhone to behave in certain ways (load arbitrary code, for example) because it is a computer.

A computer computes. That's as deep as the definition gets. It used to be humans doing repetitive calculations; they were replaced by machines. Both are called "computers".
 

Analog Kid

macrumors G3
Mar 4, 2003
9,351
12,579
Mainly because the phenomenon of computer-like devices which can't be used to write their own software is a relatively new thing. In the past, computers would come with the programming tools to make new software by default, or the programming tools were a commercial product you had to buy. Either way, you could write software to run on your computer, on your computer. That was more or less universal, until the iPad.

The test isn't so much whether you can write code (any text editor can do that) but whether you can then compile that code into a native application that works like any other native application. The idea is that it's an independent general-purpose machine that can be bootstrapped with new functionality made on the machine itself.

That is so very far from the truth...

[image]
 

G5isAlive

Contributor
Aug 28, 2003
2,832
4,875
I’ll admit I’m curious. Mostly because I’ve seen this sentiment from a few posters here at MR, and *only* here. The definition has arisen in the last year or two that a computer is only a “real” computer if you can code on it.

Technological history is an interesting topic for me, and in all my reading I’ve never seen this definition come up anywhere but here at MR. Did I miss something?

I think that's a definition of convenience from folks who want to say the iPad is not a computer. Let's not forget there wasn't a native compiler for the Mac 128K when it was first released. If you wanted to program an app for it, you needed to code on a Lisa and then port it over. Was the Mac 128K not a computer?

And let's go to the first widely acknowledged electronic digital computer, the ENIAC. No compiler. If you wanted to 'program' it, you manually input the bits in machine-level code. That's not coding as we know it today.

It's an arbitrary definition, restricting 'computer' to machines you can compile on. And if folks want to cling to it, cool. That doesn't make it true.

And at the end of the day, does it really matter? I am a user, not a programmer (anymore; shades of Tron). I run apps to be productive. I do that with my Mac, my phone, my iPad, and my AVP. So cool. According to some with an agenda, three of them aren't computers. They are interactive productivity devices. Good enough for me.
 
Last edited:

G5isAlive

Contributor
Aug 28, 2003
2,832
4,875
Mainly because the phenomenon of computer-like devices which can't be used to write their own software is a relatively new thing. In the past, computers would come with the programming tools to make new software by default

Actually, no. Historically speaking, we can answer the chicken-and-egg question. The chicken came first. The early computers did not have the tools to make new software. You literally used buttons to program in the bits. The innovation was compilers, and that came later. And here we are again: tools that do what we want but that we don't write programs on directly. Back to the future. Sorry.

edit: I am sure you will cut and paste surgically selected text from obscure websites to support your point. Don't bother. ENIAC for the win!
 
Last edited:

G5isAlive

Contributor
Aug 28, 2003
2,832
4,875
Fun fact, the PDP in PDP-11 stands for "Programmed Data Processor" because DEC's investors wouldn't support them making a "computer".

Oh wow, that takes me back to my nights in the basement of the chemistry department with programs on paper tape rolls on the PDP-11. I took a run at the Turing test but obviously fell short. My father worked for DEC, and I spent a lot of time there playing with the new toys. By then Ken Olsen had the investors under control. Unfortunately, perhaps. Thanks for the trivia!
 

mattspace

macrumors 68040
Jun 5, 2013
3,329
2,965
Australia
Good point. What hasn't gotten much discussion here is that the Visor is a client, just like the other headsets connecting to Immersed software, the Quest 3 or now (with beta software) the AVP. On the Quest 3 or AVP you run an app on the headset that lets you connect to your server device. We can presume that software is built into the Visor, but it is still a client. Where it gets interesting is the server app you run on your tethered device, which runs the apps that show up on the screens. That's an Immersed app that currently requires a subscription. The subscription is free for 3 screens (and low res) and 5 bucks for the pro version that gives you 5 screens. Now maybe they will waive that for all Visors, maybe not; the point remains that for the Visor to have full functionality, it requires both server and client software. And that is NOT inherent to the hardware. Don't update the software, and the hardware eventually fails. lol.

When you connect a Visor, the software made by Immersed on your Mac / Windows / Linux machine recognises it as a Visor and enables 5 screens, high-resolution mode, and all the other features they state in the FAQ are part of the product for life. The headset in effect functions like a USB licensing dongle.

There is no subscription to pay to get that functionality.

You really seem to be fixated on the idea that the software on the computer is a separate purchase / subscription, and that it can only be the free or Pro versions that Immersed advertises for owners of third-party headsets. That there couldn't possibly be a third version, one they don't advertise for sale separately because it's part of the Visor purchase: effectively an OEM licence of the software, whose only purpose is in the context of their first-party headset.

I don't understand why you can't, or won't, integrate that into your worldview.
 

mattspace

macrumors 68040
Jun 5, 2013
3,329
2,965
Australia
That is so very far from the truth...

[image]



*sigh* Yes, I thought we could take it as read that the phrase "computer" was originally coined to describe the people who operated adding machines in giant bookkeeping bullpens, and that we were talking in the specific context of all-digital computing and why this "is not a computer" thing had come up in the post-iPad era.
 

Analog Kid

macrumors G3
Mar 4, 2003
9,351
12,579
*sigh* Yes, I thought we could take it as read that the phrase "computer" was originally coined to describe the people who operated adding machines in giant bookkeeping bullpens, and that we were talking in the specific context of all-digital computing.

Why would we take as read an arbitrary definition about being "bootstrapped with new functionality made on the machine itself" rather than the actual definition of computer as "something that computes"?

[Wrong use of bootstrap by the way. Bootstrapping is exactly what you'd do if you don't have a running system with a compiler]

As has been explained, digital computers also often don't compile their own code. See: cross-compilers. If you read the PDP article I linked above, it explains that minicomputers were programmed by toggling switches on the front panel or by compiling code on a different computer and transferring it by paper tape.
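To make the cross-compiler point concrete, here's a minimal sketch (the file name is hypothetical, and it assumes a common cross toolchain such as arm-linux-gnueabihf-gcc installed on an x86 desktop):

/* blink.c - compiled on an x86 desktop, runnable only on the ARM
   target it was built for:
       arm-linux-gnueabihf-gcc blink.c -o blink
   The machine doing the compiling can never execute this binary;
   it has to be copied over to the ARM device and run there. */
#include <stdio.h>

int main(void) {
    printf("Hello from the ARM target.\n");
    return 0;
}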
 

G5isAlive

Contributor
Aug 28, 2003
2,832
4,875
I don't understand why you can't, or won't, integrate that into your worldview.

You are fixated on this. I am not. I already said 'whatever'. What matters to the discussion is that the Visor is only a client. That's an issue. If Immersed goes belly up and doesn't support the server-side software, the Visor is likely to be useless.

But the truth is you don't speak for the company, and you have no better insight into any of this; you are stating in black and white what a product that hasn't even been tested yet will do. You have no more clue than the rest of us.

The rest of us are trying to keep an open mind. Yours is closed.
 

mattspace

macrumors 68040
Jun 5, 2013
3,329
2,965
Australia
Why would we take as read an arbitrary definition about being "bootstrapped with new functionality made on the machine itself" rather than the actual definition of computer as "something that computes"?

NT1440 asked a question in a specific context, and I answered that question, specific to that context. People argue an iPad isn't a computer because it can't be used to make iPad software, since as a rule of thumb all other computers today can make their own software.
 

G5isAlive

Contributor
Aug 28, 2003
2,832
4,875
NT1440 asked a question in a specific context, and I answered that question, specific to that context. People argue an iPad isn't a computer because it can't be used to make iPad software, since as a rule of thumb all other computers today can make their own software.

Don't hide behind 'what other people argue'; what do YOU say?
IF you count the AVP as a "computer". I don't, in the same way that I don't count the iPad as a "computer", on the basis that neither of them can close the loop on creating software from scratch for themselves.

Oh. You say the same thing. You are still wrong :) and so are others who claim the same. Shrugs.
 
  • Like
Reactions: Analog Kid

mattspace

macrumors 68040
Jun 5, 2013
3,329
2,965
Australia
You are fixated on this. I am not. I already said 'whatever'. What matters to the discussion is that the Visor is only a client. That's an issue. If Immersed goes belly up and doesn't support the server-side software, the Visor is likely to be useless.

Would you please publicly state your agreement that the capabilities of the Visor (the 5-screen support, etc., mentioned in their FAQ as being provided in perpetuity) do not require a further subscription.

Because your continued insistence that it does, or will, is what I'm trying to correct.

If Immersed goes belly up, it will no doubt be as bad as any company (Apple & Meta included) with proprietary hardware and software drivers going belly up.
 

Analog Kid

macrumors G3
Mar 4, 2003
9,351
12,579
People argue an iPad isn't a computer because it can't be used to make iPad software, since as a rule of thumb all other computers today can make their own software.

If you define a computer as something that can make its own software, then saying all other computers make their own software isn't a rule of thumb, it's a tautology.

 
Last edited:

mattspace

macrumors 68040
Jun 5, 2013
3,329
2,965
Australia
If you define a computer as something that can make its own software, then saying all other computers make their own software isn't a rule of thumb, it's a tautology.


Yeah, so I'm not going to engage with you on this. I answered a sincere question about why people were saying "the iPad is not a computer", and explained a rationale for why people would say that.

Was it necessary to build that rationale on the entire history of mathematics & computation back to Babbage or the abacus? I don't think so, in the context of the current era of what computers typically were prior to the iPad, which is what I took the question to mean.
 
  • Angry
Reactions: G5isAlive

G5isAlive

Contributor
Aug 28, 2003
2,832
4,875
Would you please publicly state your agreement that the capabilities of the Visor (the 5-screen support, etc., mentioned in their FAQ as being provided in perpetuity) do not require a further subscription.

Because your continued insistence that it does, or will, is what I'm trying to correct.

The only thing I continue to insist on is that I won't be dragged further into the mud over this particular point, because in the end it's pointless to argue about the pricing model of a product that is still in development and not even released. Do you not get that they haven't finished working on it? Why do you continue to insist your assumptions from the website are canon? The product itself is still changing. Never mind, don't care.

But here is my public statement, just for you, since it seems to be so important. If the product is ever actually released, and if their pricing model includes in perpetuity that the client (all Visors, not just the foundation) will display 5 screens with no further charges or subscriptions, then I will say congrats, you were right. My only point (and btw, others have publicly published the same conclusion and not been corrected by the company) is that currently their other clients DO require a subscription price of 5 bucks per month for 5 screens. That's called extrapolating the future from the present and the past. That IS their current business model for clients. And sorry, the Visor is just another client. But okay, you might be right. Only time will tell. Feel better?
 

ThunderSkunk

macrumors 601
Dec 31, 2007
4,066
4,534
Milwaukee Area
Most of the use I’ve heard of for the Vision Pro is screen mirroring and focusing.
Won’t this product be a direct replacement that’s even lighter, so you can wear it all day?
It’s getting closer to what I’m looking for (just a 4K HUD for CAD), but not for a thousand dingity dang dollarinos.

Or $1,500 with a subscription! No sir, I don’t like it.
 
  • Like
Reactions: G5isAlive

G5isAlive

Contributor
Aug 28, 2003
2,832
4,875
Was it necessary to build that rationale on the entire history of mathematics & computation back to Babbage or the abacus? I don't think so, in the context of the current era of what computers typically were prior to the iPad, which is what I took the question to mean.

So no one mentioned the abacus but you. We simply pointed out that the first electronic digital computers did not have this trait you now claim is critical to the definition of a computer. It's a sign of youth to think history starts with you and that what happened before is immaterial. Spoiler alert: you didn't invent sex either.

Yeah, so I'm not going to engage with you on this. I answered a sincere question about why people were saying "the iPad is not a computer", and explained a rationale for why people would say that.

A question that arose because YOU, not others, said the iPad was not a computer. So AK is right: you are using circular reasoning. BTW, you already stated why you thought the iPad was not a computer in your original statement; why do you think repeating it and ascribing it to others makes it any more valid?

The iPad is a computer. It's not a laptop. It's not a desktop. It's not a mainframe. But that doesn't mean it's not a computer. Our devices have evolved over the years.
 
Last edited:
  • Like
Reactions: Analog Kid

G5isAlive

Contributor
Aug 28, 2003
2,832
4,875
If Immersed goes belly up, it will no doubt be as bad as any company (Apple & Meta included) with proprietary hardware and software drivers going belly up.

I think anyone reasonable will admit the odds of Immersed going belly up are MUCH higher than for Apple or Meta. Trying to equate them as similar is a fallacy.

Regardless, the discussion is about what utility the Visor will have as a standalone device. Not much. The AVP doesn't need to be tethered to be useful.

Good day.
 

Analog Kid

macrumors G3
Mar 4, 2003
9,351
12,579
Yeah, so I'm not going to engage with you on this. I answered a sincere question about why people were saying "the iPad is not a computer", and explained a rationale for why people would say that.

Was it necessary to build that rationale on the entire history of mathematics & computation back to Babbage or the abacus? I don't think so, in the context of the current era of what computers typically were prior to the iPad, which is what I took the question to mean.

The reason I find this particular sub-thread interesting is that it's happening in parallel with another sub-thread where you're discussing people being dug in on their positions and insisting they make simple clear public statements of their views.

Meanwhile, rather than acknowledging that the weirdly narrow definition you're applying should maybe yield to the simplicity of the -er grammatical construct of comput-er, you're trying every rhetorical trick you can to hold your ground and obscure an easily documented sequence of flawed reasoning.



I answered a sincere question about why people were saying "the iPad is not a computer"

The question was not why someone would say "iPad is not a computer":
I’ll admit I’m curious. Mostly because I’ve seen this sentiment from a few posters here at MR, and *only* here. The definition has arisen in the last year or two that a computer is only a “real” computer if you can code on it.

Technological history is an interesting topic for me, and in all my reading I’ve never seen this definition come up anywhere but here at MR. Did I miss something?
The question asked was whether defining a computer as something you can code on is a MacRumors-specific oddity.



and explained a rationale for why people would say that

No, you can't pass this off as "a" rationale for what other unnamed people would say:
I don't count the iPad as a "computer", on the basis that neither of them can close the loop on creating software from scratch for themselves.
This is a discussion about a statement you made and defended, and presumably your rationale for doing so.



Was it necessary to build that rationale on the entire history of mathematics & computation back to Babbage or the abacus?

You brought history into the discussion by saying that computers that can't compile their own code are a recent phenomenon:
the phenomenon of computer-like devices which can't be used to write their own software is a relatively new thing
It is not a new thing; it's the oldest thing.



in the context of the current era of what computers typically were prior to the iPad

And you're focusing on that history to the exclusion of all the modern-day computers that don't compile their own code. You are ignoring all of my other references to cross-compilers, my pointing out that your circular reasoning is why you don't think other devices that don't compile their own code are computers, and my pointing out that when people want to argue that the iPhone should behave more like the Mac, they say that it is a computer.

Computers that compile their own code from some arbitrary high-level language are a subset, I dare say a small subset, of all computers.
 
Last edited: