I’m just curious: if I were to set up a brand new, modern lab doing the sort of applications you describe, would the manufacturers set me up with equipment that requires terribly outdated computers and software to operate? It would strike me as odd that development of scientific and medical instrumentation had somehow stopped cold in the 1990s...

It's because in many academic circles, time is considered to be free and results are basically worthless. In academia, salaries and funding sources are largely decoupled from the actual time spent on projects, which is borderline illegal but tolerated. Nobody considers that a test costs $x in labor.

If it was a commercial lab, or results that would lead to a commercial medical product, all that equipment would be trash because it fails to comply with FDA Data Integrity guidance, drug cGMP, European Pharmacopoeia 9.7, NIST traceability, etc. Only in a university is severely obsolete equipment producing questionable data tolerated.
 
So, let's say I tune and calibrate my HP 5971 per EPA methods (internal tune with PFTBA; mass accuracy and ratios verified periodically with DFTPP). Why do you consider that data questionable, or for that matter any worse than data from a new Agilent 5977?

If, for example, I were to buy a new, in-date, NIST-traceable polystyrene reference and use that to calibrate a 20-year-old FTIR, then verify it periodically, why is the data from it questionable?

These are things I do periodically, btw.
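For what it's worth, that periodic verification is simple enough to script. Here's a rough sketch of the kind of check I mean; the band positions and tolerances below are placeholder numbers for illustration, not the certified values from any particular reference:

```python
# Rough sketch of a periodic FTIR wavenumber-accuracy check against a
# certified polystyrene film. Peak positions and tolerances are placeholders;
# use the values from the reference's NIST-traceable certificate.

# certified band position (cm^-1) -> allowed deviation (cm^-1), hypothetical numbers
REFERENCE_BANDS = {
    3060.0: 1.0,
    1601.0: 1.0,
    1028.0: 1.0,
}

def nearest_peak(measured_peaks, target):
    """Return the measured peak closest to the certified position."""
    return min(measured_peaks, key=lambda p: abs(p - target))

def verify(measured_peaks):
    """Compare measured polystyrene peaks to certified positions.

    Returns (certified, measured, deviation, passed) tuples so the result can
    be logged alongside the date and the reference's serial number.
    """
    results = []
    for certified, tolerance in REFERENCE_BANDS.items():
        measured = nearest_peak(measured_peaks, certified)
        deviation = measured - certified
        results.append((certified, measured, deviation, abs(deviation) <= tolerance))
    return results

if __name__ == "__main__":
    # peaks picked from today's polystyrene spectrum (example values)
    todays_peaks = [3059.6, 1601.3, 1028.4]
    for certified, measured, dev, ok in verify(todays_peaks):
        print(f"{certified:7.1f} cm-1: measured {measured:7.1f} "
              f"(dev {dev:+.1f}) -> {'PASS' if ok else 'FAIL'}")
```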
 
Our recent equipment refreshes have all come compliant with FDA Data Integrity guidance, cGMP, etc. by default. They have data integrity and anti-tamper features: forced equipment self-checks (you must measure cal references periodically with internal comparison routines, and if they don't pass, the system locks out); files that are cryptographically signed and carry a log of any data manipulation, with the ability to always roll back to the acquired data; software and firmware versions matched against a protocol, with a certificate generated on install; and beampath accessories with identification ICs that the system monitors for compatibility, again locking out if the configuration is unapproved.
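To put the "signed files plus audit trail" idea in concrete terms, a toy sketch looks something like this; the key handling, field names, and record layout here are invented for illustration, not any vendor's actual format:

```python
# Simplified illustration of signed acquisition data with an append-only
# audit trail. Real instrument software does this inside its own file format
# and protects the signing key in hardware; this is just the shape of the idea.
import hashlib
import hmac
import json
import time

SECRET_KEY = b"instrument-specific-key"  # placeholder key for the example

def sign(payload: bytes) -> str:
    """HMAC-SHA256 signature over a byte payload."""
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()

def new_record(acquired_data: dict) -> dict:
    """Wrap raw acquired data with a signature and an empty audit trail."""
    raw = json.dumps(acquired_data, sort_keys=True).encode()
    return {"acquired": acquired_data, "acquired_sig": sign(raw), "audit": []}

def apply_processing(record: dict, who: str, action: str, processed: dict) -> dict:
    """Log a processing step without touching the original acquired data."""
    record["audit"].append({
        "time": time.time(),
        "user": who,
        "action": action,
        "result_sig": sign(json.dumps(processed, sort_keys=True).encode()),
    })
    record["processed"] = processed
    return record

def verify_acquired(record: dict) -> bool:
    """Check the acquired data still matches its signature, i.e. can be rolled back to."""
    raw = json.dumps(record["acquired"], sort_keys=True).encode()
    return hmac.compare_digest(sign(raw), record["acquired_sig"])
```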

Features like these are real FDA and EU requirements, especially considering scandals like Theranos. Your 20-year-old equipment will not have them and cannot meet regulatory requirements.

And if it's human samples, now you have HIPAA and IRB cybersecurity requirements, which I guarantee you cannot meet with Mac OS 9.

Perfect example: FTIRs. The HeNe and sources all have a lifetime. Sure, they're basically a headlight bulb and an oven igniter, but if you can't get manufacturer spares and are buying the pieces off eBay, your source stability isn't traceable, and therefore your measurements aren't guaranteed not to drift over time. On top of that, things like the ADC boards require factory cal.

The other very strict market for things like FTIRs is for crime labs. If the manufacturer isn't going to say the equipment is calibrated and maintained to their standards, it will rightly cause an issue in court, where people's lives are at stake.

The issue isn't whether you trust it, but whether others can trust what you did.
 
Here's the thing: we're not doing anything admissible in court or submitted to another regulatory agency, and frankly, as an instrumental chemist, I DON'T want things locked down.

Further to that, I think it's a bit rich to say that our data is "trash" because certain regulatory agencies won't accept it as correct. I've had plenty of data that I generated or helped generate published in JACS, Angewandte Chemie, and subsidiary journals of their parent publishers. They don't have an issue with the data, and that's good enough for us. To be perfectly frank, sometimes we DO need to get hands-on with our instruments to do things like design experiments that you won't find published in a standard method. That's not to say that standardized methods don't have a place, but when you're trying to do things like characterize a newly synthesized compound or observe something that was previously undocumented, often what has been written for mass consumption (such as by a regulated lab) just doesn't work for us.

To your point on FT-IR: I don't replace HeNe lasers on any set schedule, but at the same time, if my results start looking iffy, it's one of those things I look at. I've put new lasers in both of my FT-IRs in the past year. The older Mattson that has a few very specific applications also got a new Glo-Bar last fall, since I could not get it to calibrate correctly with the new laser. So yes, all of that stuff gets looked after.

Also, to that point, Nicolet will sell you a nice, locked-down, can't-tinker-with-it FT-IR, or a much nicer, more expensive, "research grade" instrument that WILL actually let you get down and dirty with the optical bench to your heart's content. I was trying to get the money to buy an FT-Raman earlier this year, and one of the options our Nicolet salesman (whom I've known for many years, but who has since retired) floated to me was a combined FT-IR/Raman that would give us a lot of flexibility. Ultimately, the money wasn't there.

On a related note, one of the "hot products" in mass spec right now is Thermo's Orbitrap. They've gone all in on them and have totally exited the superconducting FT-ICR-MS market, but I have serious concerns about the quality of data that those instruments turn out. I don't want to say too much publicly, but after seeing a lot of the "funny math" that goes into making the Orbitrap work, I distrust its mass accuracy beyond +/- 200 amu of ~250. At the same time, it's too much of a "black box" to really see WHAT is going on.
 
I've had plenty of data that I generated or helped generate published in JACS, Angewante Chemie, and subsidiary journals of their parent publishers. They don't have an issue with the data, and that's good enough for us.

And that's my point. You're in a strictly academic environment where data integrity doesn't matter and you're just finding data to support a foregone conclusion. The majority of instruments, according to all the reps we've talked to, are sold to the industrial market, where they buy something with a 10-year lifetime, properly amortize the costs, and at the end trade it up for an updated unit. The majority of customers are different from you.

The moment you step over to even a medical school's research environment, HIPAA and IRB privacy regulations kick in, as I said, and the data might go to the FDA, so you simply can't use decades-old obsolete equipment. There's going to be quite some turnover now that Windows 7 is going out of service.

That's your Nicolet stuff that's locked. Our Bruker instruments can go in and out of validated mode by changing the settings.
 
I forget where I saw it, but I think one of the kernel developers of OS 9 said that, unlike every previous version of Mac OS, it has a true multitasking kernel. Before 9, multitasking was handled only by a system extension.

What I mean to say is that if anyone actually wanted to do it (no one does, understandably), the best approach for OS 9 would be to build a web browser from the ground up, not a fork like Classilla or TenFourFox. And that browser, to reach its full potential, would perhaps have to ignore anything before 9 outright, unlike Classilla and the browser it was based on.

This is all untested theory, though. God only knows the HELL that would entail to get a browser done right in OS 9, from the ground up, and what results are actually achievable.

Classilla has me pleased, though. So all is good.

I hate to revive an old thread, but I just pulled out my G4 Pismo and OS 9.2.2 works like a charm. I also downloaded Classilla and tried iCab, and both browsers render very well. True, we can't really watch YouTube under OS 9, but I have faith that the folks over at os9lives.com will come up with ways to make OS 9 do things that were never imagined. I think it's resurrected, and as someone who grew up with the Mac and bought the first Sawtooth as an employee of CompUSA back in 1999, I like OS 9 a lot. I'm also using Tiger as well.
 
Since the thread is revived, I wanted to ask about something I thought of: I'm no dev; in fact I have (practically) no programming abilities whatsoever. But as I tinkered with my Win98 computer, I discovered a new browser based on Firefox 2: RetroZilla. I know that Classilla is based on Mozilla 1.7, which is quite a bit older, but is there stuff we could possibly backport from RZ to Classilla? Even minor backported tweaks could probably improve the Mac OS 9 web-surfing experience, I think...

What do other, more knowledgeable people think about that?
 
I discovered a new browser based on Firefox 2: RetroZilla.

I remember gunning for YouTube with RetroZilla under VPC/OS 9... but with no success.

 
Well, OS X on a G3 is pretty much unusable. All I think is that even minor tweaks could improve Classilla, which, unlike what many say, is still usable for an un-updated browser.
 
In my experience, Tiger likes RAM as much as anything.

Some of the early G3s with low RAM ceilings (~384 MB), like Wallstreets, can be a little sluggish on Tiger. Use a beige desktop/tower (or AIO) with 768 MB, or an iMac or B&W with 1 GB, and Tiger runs great. Of course, the PowerMac-based ones benefit from plentiful CPU upgrades and easy overclocking.

Unfortunately, my fastest G3 can only officially run 10.4.9...
 
10.4.9? Anything that "officially" runs Tiger will be capped at 10.4.11

Aside from that, I concur with everything else mentioned. My 300 MHz Beige G3 runs fine with Tiger. It's actually faster than OS 9, depending on what you're doing.
 
@Surrat knows better than I do, but there's something weird about this particular CPU where stuff breaks past 10.4.9. Past 10.4.9, the CPU runs at 500 MHz.

Mine is a 1 GHz G3, while Surrat has a 1.1 GHz.
It's an upgrade CPU, I presume? I don't have any experience with them, but I remember Tom on the IMNC YouTube channel having issues with his "World's Fastest Power Mac G3" when running under 10.4.11. The software that talks to the upgrade card is incompatible with the later Tiger updates, IIRC.
 
Yes, it's an upgrade.

And yes, they do have issues with 10.4.11, hence why you run in 10.4.9.

Mine is set up as an OS 9 box primarily, as it has a Voodoo 5 in it. That's a superb OS 9 card, but not great in OS X.
Ok, that is interesting and makes sense. I'll be honest, I didn't know that there were Mac Voodoo cards :p
 
There was one and it's fairly scarce.

It's actually one of those few times where the PC guys bought the Mac edition to flash. The Mac Voodoo 5 5500 was, to my understanding, the only Voodoo made with DVI.

Unfortunately, very little Mac software exists that can actually leverage all the stuff the card can do.
 
I was curious and found a Mac version for sale on eBay.

$750 USD, wow!
 
I paid plenty for mine, but not anywhere NEAR that much...they do come up if you want one and watch for it.

Interestingly enough, I bought mine and the 1 GHz at the same time. Both showed up in separate B&W G3s on eBay UK. It would have cost a pile of money to have them shipped here complete, and of course I'd have opened the boxes to find a pile of white plastic.

Instead, I had them shipped to my good friend @LightBulbFun , who stripped the 1ghz and the Voodoo out, along with some other goodies from those, and just shipped that to me. He got to keep the B&W towers, and also got to play with those two pieces for a few weeks before they got sent on to me.
 
That's interesting! I need to do more research on 3D accelerator cards from the late '90s and early 2000s in general. I have some preliminary knowledge, but not as much as I'd like.
 