
Wizec

macrumors 6502a
Jun 30, 2019
680
778
I have noticed that people at my university, and a lot of people on social media, have adopted ChatGPT into their workflow. These "co-pilots" are a crucial part of their work and live permanently off to one side: you are constantly sending the model ideas and editing its drafts. For that you need a keyboard, which makes an iPad without a keyboard useless for people working in 2023. Of course, people say you can use the iPad with a keyboard, but at that point you might as well use a MacBook instead.

What are your thoughts on the iPad's place in a world of AI butlers like ChatGPT?
 
  • Like
Reactions: eltoslightfoot

boss.king

macrumors 603
Apr 8, 2009
6,394
7,647
What does this have to do with the iPad? You've been able to use iPads with keyboards for years, possibly even since day one. Or you can dictate. Or scribble with the Pencil. Or use the on-screen keyboard. If this makes an iPad useless to you, then I suspect you probably never really needed one in the first place.
 
  • Like
Reactions: Jumpthesnark

MisterSavage

macrumors 601
Nov 10, 2018
4,850
5,749
I can't say this enough to the edtechs, profs and students that I work with: ChatGPT is *not* a source of factual information. It can appear very impressive and seemingly authoritative, but you get no feedback or indication as to whether the information it provides is correct or not. Quite often it is not (apparently 15-20% of the time - see below).

I've never heard anyone describe it as "hallucinating" but that really hits the nail on the head. I've been impressed when ChatGPT is helpful, but several times it has completely made up answers and given me sample code that wouldn't compile.
 
  • Like
Reactions: thefourthpope

thefourthpope

macrumors 65816
Sep 8, 2007
1,439
848
DelMarVa
Counterpoint: there are more iPads in my junior-year methodology course than ever before. Students are making more efficient use of iPads to collect, organize, and access the unimaginable amount of information coming at them.

Many of my faculty colleagues, on the other hand, are really nervous that ChatGPT will make their roles obsolete. My take is that they aren't terribly good at, or creative in, their assessments. The answer is more critical thinking and more specific case-study application.
 

thefourthpope

macrumors 65816
Sep 8, 2007
1,439
848
DelMarVa
I've never heard anyone describe it as "hallucinating" but that really hits the nail on the head. I've been impressed when ChatGPT is helpful, but several times it has completely made up answers and given me sample code that wouldn't compile.
I’ve been genuinely surprised at how frequently answers contain information that’s factually incorrect even when accurate information is readily accessible via long-standing resources like textbooks or Wikipedia.
 

KENESS

macrumors regular
Mar 14, 2003
218
660
I have noticed that people at my university, and a lot of people on social media, have adopted ChatGPT into their workflow. These "co-pilots" are a crucial part of their work and live permanently off to one side: you are constantly sending the model ideas and editing its drafts. For that you need a keyboard, which makes an iPad without a keyboard useless for people working in 2023. Of course, people say you can use the iPad with a keyboard, but at that point you might as well use a MacBook instead.

What are your thoughts on the iPad's place in a world of AI butlers like ChatGPT?

I think looking at a piece of new technology, especially in its infancy, and saying it is going to cause wholesale change to something well established is a fool's errand. Sometimes it does, sometimes it doesn't. The best, most accurate predictions... are made after the fact, once it has already happened. ;)

For example, where is our paperless office? Computers and electronic mail promised us a paperless office. That was all anyone talked about for a while back in the day. Are some offices today essentially paperless? Of course. But, spoiler alert, we're printing more now, by an order of magnitude, than we EVER did before computers, all because of... computers.
 

MisterSavage

macrumors 601
Nov 10, 2018
4,850
5,749
I’ve been genuinely surprised at how frequently answers contain information that’s factually incorrect even when accurate information is readily accessible via long-standing resources like textbooks or Wikipedia.
Same here. I even thought it must be referring to a third-party library rather than a built-in language feature, but it said it wasn't when I pushed back.
 
  • Like
Reactions: TechnoMonk

TechnoMonk

macrumors 68030
Oct 15, 2022
2,606
4,116
Same here. I even thought it must be referring to a third-party library rather than a built-in language feature, but it said it wasn't when I pushed back.
Yep. I got the weirdest responses when I asked it to convert Python code to support the Apple GPU. It gave me functions and arguments that the OpenCV libraries don't support, and only corrected itself after a few iterations.
It does well with front-end React frameworks.
It's still much more advanced than GitHub Copilot, which I use a lot.
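For reference, the straightforward way to target the Apple GPU from Python is PyTorch's MPS (Metal) backend; as far as I know, OpenCV's own GPU paths are CUDA/OpenCL rather than Metal, which is roughly where the made-up arguments come from. A minimal sketch (the model and tensor are made up purely for illustration):

```python
import torch

# Use the Apple GPU (Metal Performance Shaders) backend if this build of
# PyTorch exposes it; otherwise fall back to the CPU.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

# Hypothetical model and input, just to show the pattern: move the weights
# and the data to the same device, and the forward pass runs on that device.
model = torch.nn.Linear(128, 10).to(device)
x = torch.randn(32, 128, device=device)
y = model(x)
print(y.device)  # "mps:0" on Apple Silicon with a recent PyTorch, "cpu" otherwise
```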
 

mkelly

Cancelled
Nov 29, 2007
207
218
I've never heard anyone describe it as "hallucinating" but that really hits the nail on the head. I've been impressed when ChatGPT is helpful, but several times it has completely made up answers and given me sample code that wouldn't compile.

Funnily enough, "hallucinating" is the accepted term for what happens when an AI "invents" new information. I wish I could claim that I came up with it lol :)

It's best to think of chatGPT and its friends like a super-advanced version of autocomplete on your iPhone. Several years ago, there was a brief fad where people would open up notes on their iPad and type an initial word or phrase - then just tap the autocomplete suggestions repeatedly to see what it would "create". The results were often nonsensical and occasionally funny. That's chatGPT, but with a much larger dataset to work from (and some fancier algebra in the backend).
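To make the "advanced autocomplete" idea concrete, here's a toy version of that tap-the-suggestion-repeatedly game; the tiny word-count table is made up and just stands in for the model:

```python
from collections import Counter

# Toy autocomplete: count which word tends to follow each word, then keep
# picking the most common continuation. ChatGPT works on the same principle,
# only over tokens and with a learned neural network instead of raw counts.
corpus = "the ipad is great the ipad is useless the keyboard is great".split()
next_words = {}
for word, nxt in zip(corpus, corpus[1:]):
    next_words.setdefault(word, Counter())[nxt] += 1

text = ["the"]
for _ in range(6):
    choices = next_words.get(text[-1])
    if not choices:
        break
    text.append(choices.most_common(1)[0][0])  # always accept the top suggestion

print(" ".join(text))  # reads plausibly, with no "understanding" anywhere in sight
```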

Here's a good article on the whole thing that makes a great analogy between the information in chatGPT, and blurry JPEGs.

 
Last edited:

1800AirTAG

macrumors 6502
Aug 2, 2014
258
655
I won't call the iPad useless, because many people have different uses for it. But in my personal opinion, ChatGPT didn't make the iPad useless; the lack of a proper file system and proper multitasking did.
 

mkelly

Cancelled
Nov 29, 2007
207
218
I don’t envy future teachers having to face the question of whether an essay was written by a student or their app.
This isn't a "future teachers" problem; it's happening today. So far, though, it hasn't been too hard to catch them, thanks to the propensity of AIs like ChatGPT to generate incorrect information in their output. Many of the students doing this don't catch the AI's errors because they never did the work to know or understand the subject well enough in the first place. When the instructor grades the essay, the incorrect info typically stands out.

A big issue is the extra workload on instructors if they want to pursue an academic integrity case against the student. Many of them feel it's not worth the time and hassle. Some will let the student squeak by; some will fail the student, in which case they'll just see that student in their class the following year (or, if they're lucky, the student will take the same class with a different instructor, passing the buck); and a small number will call the student out. It depends on the academic institution and what its policies and procedures are around academic integrity.
 

TechnoMonk

macrumors 68030
Oct 15, 2022
2,606
4,116
Funnily enough, "hallucinating" is the accepted term for what happens when an AI "invents" new information. I wish I could claim that I came up with it lol :)

It's best to think of chatGPT and its friends like a super-advanced version of autocomplete on your iPhone. Several years ago, there was a brief fad where people would open up notes on their iPad and type an initial word or phrase - then just tap the autocomplete suggestions repeatedly to see what it would "create". The results were often nonsensical and occasionally funny. That's chatGPT, but with a much larger dataset to work from (and some fancier algebra in the backend).

Here's a good article on the whole thing that makes a great analogy between the information in chatGPT, and blurry JPEGs. It's a very good explanation of what's happening behind the scenes with these systems, and why they can't help but "hallucinate".

The funny thing is the guy who wrote the article doesn’t really understand AI. ChatGPT isn’t an algorithm, and his understanding is pretty flawed.
 

zerocharlie

macrumors member
Jul 6, 2022
47
113
People on the internet use such strong language to make a point or draw attention (sometimes both).

“Useless”.

The iPad has plenty of uses for millions of people that are unrelated to ChatGPT. By definition, that means the iPad still has use.

Perhaps you’re a Drama major. Definitely not Communications.
 
  • Haha
Reactions: Jumpthesnark

mkelly

Cancelled
Nov 29, 2007
207
218
The funny thing is the guy who wrote the article doesn’t really understand AI. ChatGPT isn’t an algorithm, and his understanding is pretty flawed.

It's an analogy and not meant to be interpreted literally. It's for the layperson that doesn't understand linear algebra, tensors, vector spaces, training models, etc. The point wasn't to explain the intricate inner workings of ChatGPT & similar models, it was to give the reader something they could relate it to that they were familiar with.

I mean it'd be *great* if the average person *did* understand tensor algebra and all the rest. Because then they likely wouldn't be running around preaching about how wonderful and amazing and magical AI is (as an all-purpose solution to everything), and would understand what it's good at and what it isn't.

But since we're not there yet, education-wise, analogies will have to do.

(While we're at it, I'd also like to see tech reviewers adopt a basic understanding of thermodynamics - it would help them understand why they can't have all the processing power, battery life and minimal heat we want in a wafer-thin device ... lol)
 
Last edited:
  • Like
Reactions: Night Spring

TechnoMonk

macrumors 68030
Oct 15, 2022
2,606
4,116
It's an analogy and not meant to be interpreted literally. It's for the layperson that doesn't understand linear algebra, tensors, vector spaces, training models, etc. The point wasn't to explain the intricate inner workings of ChatGPT & similar models, it was to give the reader something they could relate it to that they were familiar with.

I mean it'd be *great* if the average person *did* understand tensor algebra and all the rest. Because then they likely wouldn't be running around preaching about how wonderful and amazing and magical AI is (as an all-purpose solution to everything), and would understand what it's good at and what it isn't.

But since we're not there yet, education-wise, analogies will have to do.

(While we're at it, I'd also like to see tech reviewers adopt a basic understanding of thermodynamics - it would help them understand why they can't have all the processing power, battery life and minimal heat we want in a wafer-thin device ... lol)
It is a terrible analogy. First of all, a lossy JPEG or a Xerox copy made with compression doesn't link or understand information. It's a copy, and it fills in the noise with interpolation.
ChatGPT doesn't copy or quote verbatim the information it was fed in training. In fact, ChatGPT can apply complicated mathematical concepts to other fields. I can ask ChatGPT how to apply a given mathematical concept in computer vision, and it comes up with good ideas. Ask it to write the code, and it will give you good stuff using OpenCV. Request the same in PyTorch3D, and it starts guessing based on what it has learned and comes up with nonexistent libraries.
Yes, it has its flaws, but to even compare it to a lossy JPEG shows a clear lack of basic logic or understanding. Sure, it uses some statistical similarities, but that's like saying a Ford Pinto or a Yugo is the same as a Tesla or any other car.

For some reason, it does really well with optical flow capture on video, depth maps, and other complicated algorithms. Ask it for simple code to remove a background using an Apple/AMD GPU, and it comes up short.
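For comparison, the plain CPU version of that background-removal task is standard OpenCV; as far as I know its GPU modules target CUDA/OpenCL rather than Metal, which is where the guessing starts. A rough sketch (the video path is a placeholder):

```python
import cv2

# Classic CPU background subtraction with OpenCV: build a background model
# over time and mask out everything that doesn't move.
cap = cv2.VideoCapture("input.mp4")  # placeholder path
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                         # per-frame foreground mask
    foreground = cv2.bitwise_and(frame, frame, mask=mask)  # keep only moving pixels
    cv2.imshow("foreground", foreground)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```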

I will happily pay for ChatGPT so I don't have to deal with Google search or the others, if it keeps improving with more relevant training.
 

TVreporter

macrumors 68020
Mar 11, 2012
2,056
3,418
Near Toronto
My two kids use my wife's and my iPads to play a few games, chat with their friends and watch Netflix.

Everyone’s use is different - don’t try to say they’re useless.
 
  • Like
Reactions: Jumpthesnark