My journey in electronics/instrumentation was somewhat mentored by National Instruments.
This https://electronics360.globalspec.c...g-ni-ceo-reflects-on-the-past-and-looks-ahead article sums it up from their side.
If you go a "little earlier" than the start of the article, NI developed hardware and software drivers for the PDP-11, and oddly enough I had a PDP-11 background from high school and college, just not at the real-time level.
The "growing pains" started when, the boss said we had to modernize our testing from BCD interfaced meters and x-y recorders. A minor jump was eliminating the BCD instruments and convert to IEEE-488. So, guess what? The testing process got SLOWER. But, the BCD meters were getting obsolete.
The next part of the process was eliminating the X-Y recorders. My comment was, "The software we need isn't here yet." But the boss said the money was.
NI had introduced LabVIEW on the Mac, so with IEEE-488 we cobbled together what parts we had and made a platform change. I wanted an SMU (source-measure unit) based system but was told no; that was basically a cost decision, and we were building two systems, one to be a spare and mobile. I think we swapped out one item while the other was being repaired during its 17-year life. The Mac never lost a hard disk; just one floppy drive, some dust, and a fan. A server for data was nixed.
A Mac server maintained by us and a Linux-based SMB server (maintained and backed up by our IT department) did eventually show their faces.
Slow as usual. The concept was to print tests as they were done, but that turned out to be impractical. The printer did better spitting out pages continuously rather than warming up for each one. Laser printers were new then, too.
Trying to be a manager while developing custom hardware (all related) for yet another system was basically too much for me. I had one person, as inexperienced as myself, working under me, so I was overwhelmed.
So, my choosing LabVIEW and the Mac paved the way for the department's use of Mac computers, primarily for desktop publishing. I liked the Mac's flat memory model and long file names (32 characters at the time versus 8.3 for Windows). Microsoft Word was developed for the Mac market until the PC caught up.
The PC did catch up, and other instruments now had PC-centric interfaces/programs. In another system I was TOLD to use cards in a Mac IIx to computerize something. I did not want to do that; I was overruled. Bad move. Expansion slots disappeared from Macs and PCs alike.
The 17-year-old system eventually got overhauled again, with SMUs like I had wanted. I did not do the programming. An interface to Excel now existed rather than comma-separated data, so individual tests and summary data were available for each test. This one was PC based. Still slow.
Unlike the previous system, where data was displayed as it was acquired (which was also a slow process), you now had to wait two minutes to see whether the data was good.
I guess in the grand scheme of things, one could use meters with a graphical display now.
LabVIEW made it impossible to append a single point to a graph. To plot point 1 you sent point 1; for point 2, points 1 and 2; for point 999, all of points 0-998 plus the new one. You re-sent the entire array on every update, which was really slow on, say, a 66 MHz machine.
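To make that cost concrete, here's a minimal sketch (Python standing in for the block diagram); `redraw` below is a stand-in for the graph control taking the full array on every call.

```python
# Minimal sketch of the quadratic cost of re-plotting a growing array.
# redraw() stands in for the era's LabVIEW graph update, which took
# the entire data array on every call instead of appending one point.
def redraw(points):
    for p in points:
        pass  # imagine drawing point p here

data = []
for i in range(1000):
    data.append(i)   # acquire point i
    redraw(data)     # re-send points 0..i, every single time

# Total work: 1 + 2 + ... + 1000, about 500,000 draw operations,
# versus 1,000 if a single point could have been appended.
```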
NI kept updating things during the development, going from a Mac-only model to a Linux/Mac/PC model. Then the PC became the standard for both publishing and laboratory equipment.
It was weird making decisions in a snapshot of time where the stuff you needed didn't exist, was too slow, or was constantly changing (software), all while being overwhelmed by the technology.
One decision I was FORCED to make was to design an interface to an existing monochromator and to build electric shutters and a filter wheel. There were no drivers for the stepper controllers, and it probably would have been best if I had written them. The stepper controller could be programmed in its own BASIC-like language, so it could handle the shutters and even moving the filter wheel.
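To give a flavor of what driving one of those controllers involved, here is a hypothetical sketch using pyserial. The port name and every command string (HOME, MOVE) are invented stand-ins for illustration, not the actual controller's language.

```python
# Hypothetical sketch: commanding a stepper-based filter wheel over a
# serial line with pyserial. HOME and MOVE are invented stand-ins for
# a controller's BASIC-like command set; the port name is assumed too.
import serial

STEPS_PER_FILTER = 200  # assumed step count between filter positions

with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
    port.write(b"HOME\r")        # find the wheel's index position
    print(port.readline())       # controller's acknowledgement

    def goto_filter(n: int) -> None:
        # issue an absolute move to filter position n
        port.write(f"MOVE {n * STEPS_PER_FILTER}\r".encode())
        print(port.readline())

    goto_filter(3)  # select filter 3
```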
At the very end, a big, mostly empty box controls one shutter, and "they" probably have no idea where the code is. The world eventually produced IEEE-488 based monochromators, BUT our old one was better: it had a real-time interface that controlled start/stop and counts using a PDP-11. So the intent was to control the stepper directly through a programmable interface. I learned a lot. The software ended up unreadable on a SyQuest cartridge, but it wasn't complete anyway.
In terms of "process control" type work, it seems the "research variety" is much harder to do. I never had training in, say, the PLC/Wonderware type of approach, but I did do things with the real-time approach.
Advancing a synchronous motor was "cake" to do with the PDP-11, whereas the PC required counters and extra hardware to do it, e.g., move for x ms.
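The difference is timing guarantees. On the PDP-11 you owned the machine, so a software-timed pulse was exact. The sketch below shows the software-timed version of "move for x ms"; on a desktop OS the sleep can overshoot by milliseconds, which is why the PC route needed a hardware counter/timer card instead. `set_output` is a hypothetical digital-output call, purely for illustration.

```python
import time

def advance_for_ms(set_output, x_ms):
    """Software-timed move: assert the motor drive output for ~x ms.
    Exact on a machine you fully own; on a desktop OS, time.sleep()
    can overshoot by milliseconds, hence the hardware counter card."""
    set_output(True)            # energize the drive
    time.sleep(x_ms / 1000.0)   # no accuracy guarantee on a desktop OS
    set_output(False)           # de-energize

# set_output is a hypothetical digital-output call for illustration
advance_for_ms(lambda on: print("drive", "on" if on else "off"), 50)
```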
Research gizmos are really hard when management doesn't know what's required and the vendors aren't clear. One example: nobody knew you had to purchase a separate programming package for the stepper controllers in order to program sequences in their BASIC-like language. Another: I needed a special type of converter to make the differential RS-422 signals compatible with the higher-voltage, single-ended RS-232 signals. All of the stepper information I had gathered turned out to be really useful.