
SOC solutions
Dana, how does the programmer know what code has been installed when you drag and drop the 'boxes' into the device? It seems insecure and lacks code visibility. Or am I missing something?
 
The programmer uses the API calls associated with the components (in PSOC language
a component is an on-chip resource); dragging and dropping does not install code.

You drag and drop a component, double-click and configure it, like the digital filter example
I showed earlier, e.g. frequency, stages, type, BW, and then at minimum write a single
start instruction. The compiler then auto-generates the setting f() calls to handle that initial
configuration. The APIs facilitate user-controlled configuration for real-time control,
and also reconfiguration (should you decide to do it that way), like controlling phase and/or frequency
in a DDS, or gain in a PGA, or COM buffers, or whatever. Just like you do in any processor.
Think of the APIs as precoded, optimized drivers done for you. So you use them versus
writing and configuring a ton of HW registers to control the component. But you can get into
the weeds if you want to; typically that's not the case or needed.

The only additional auto-generated code comes when a build is run: basic config
instructions are created to configure stuff like system clocks for the CPU, V trim values, ... which normally
you do not touch, but can should you so desire. So the majority of code work is done with f()
calls, and only in unusual cases do you get into the weeds.

So it's like writing in Fortran or Basic, where you told the machine to multiply but did not write the ASM:
you code in C, mostly with these API calls.

Here is a partial summary of calls in a PWM component, for example, that you would use:



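The image with the call summary is not shown here, but as a hedged sketch, a PWM component's generated API typically includes functions along these lines. The stubs below are illustrative stand-ins following the usual ComponentName_Function naming pattern, not the actual auto-generated code:

```c
#include <stdint.h>

/* Illustrative stand-ins for a PWM component's auto-generated API.
 * Real generated code would write hardware registers; these stubs
 * just model the behavior. */
static uint16_t pwm_period;
static uint16_t pwm_compare;
static int pwm_running;

void PWM_Start(void)              { pwm_running = 1; }  /* enable the block */
void PWM_Stop(void)               { pwm_running = 0; }  /* gate it back off */
void PWM_WritePeriod(uint16_t p)  { pwm_period = p; }   /* counter period */
void PWM_WriteCompare(uint16_t c) { pwm_compare = c; }  /* duty threshold */
uint16_t PWM_ReadPeriod(void)     { return pwm_period; }
uint16_t PWM_ReadCompare(void)    { return pwm_compare; }

/* Typical use from main(): one start call, then runtime control. */
void pwm_demo(void)
{
    PWM_Start();            /* the single start line mentioned above */
    PWM_WritePeriod(999);   /* e.g. 1 kHz from a 1 MHz clock */
    PWM_WriteCompare(250);  /* 25% duty cycle */
}
```

The point is that duty cycle and period become function arguments rather than register bit fields you look up in a datasheet.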
So your main() looks like main() in basically any C processor IDE, and the compiler adds a file of
C functions for that component in a separate file (you don't touch this) and uses just the calls the user
invoked. Generally speaking, you do not wade through a doc looking for register names to set/clear/load
to make a component work.

An additional beauty of the approach is that you can start/stop a component, so if power focused, you can minimize
power when a component is not needed.
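The start/stop pattern for power savings can be sketched like this. The ADC_* names are hypothetical stand-ins for a component's generated API, and the returned sample is a dummy value:

```c
#include <stdint.h>

/* Hypothetical sketch: gate a component on only while it is needed.
 * On real silicon, Stop() would power the analog block down. */
static int adc_powered;

void ADC_Start(void) { adc_powered = 1; }                    /* power up, enable */
void ADC_Stop(void)  { adc_powered = 0; }                    /* low-power state */
uint16_t ADC_Read(void) { return adc_powered ? 512u : 0u; }  /* dummy sample */

uint16_t sample_once(void)
{
    ADC_Start();            /* wake the block only for the measurement */
    uint16_t v = ADC_Read();
    ADC_Stop();             /* back to low power immediately after */
    return v;
}
```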


Regards, Dana.
 
Note in post #2 I showed the catalog of on-chip components. Here is a partial list of what the
user community has done (you can create your own components out of schematic
capture and/or Verilog) and add to the tool lib:



Some std 74-series TTL and other stuff like DDS32, CORDIC, keypad, 24-bit HW MAC (no CPU intervention), ...


Regards, Dana.
 
Think I get your review, thanks. It's like building your board in the panel, joining the dots, then the PSOC system builds the code. I don't use PIC libraries at all (my code would make a 'pro' cry). I do use '__builtin', but I prefer to write a device driver, for instance PORTS_init(); I2C_init(); that sets the registers. For example, a 'call' to LCD_print(line, column); does the rest. That makes the xxx_init(); device specific, but LCD_print will work regardless (hopefully). It's then you find out the PIC18Fxxx42 is a scratch-head device.
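The driver split described here — a device-specific xxx_init() plus portable calls like LCD_print() — can be sketched as below. All names come from the post; the text argument and the buffer-backed display are added for illustration and are not a real PIC library:

```c
/* Hypothetical sketch of the pattern: the _init() functions hide the
 * device-specific register setup, while LCD_print() stays the same
 * across parts. The 2x16 buffer stands in for the real panel. */
#define LCD_ROWS 2
#define LCD_COLS 16
static char lcd[LCD_ROWS][LCD_COLS + 1];

void PORTS_init(void) { /* device-specific: TRIS/LAT register setup */ }
void I2C_init(void)   { /* device-specific: pins, baud, enable bits */ }

void LCD_print(int line, int column, const char *text)
{
    /* Portable layer: on hardware this would push characters over the
     * bus configured in I2C_init(); here it just writes the buffer. */
    if (line < 0 || line >= LCD_ROWS || column < 0)
        return;
    for (int i = 0; text[i] != '\0' && column + i < LCD_COLS; i++)
        lcd[line][column + i] = text[i];
}
```

Swapping to a different PIC then means rewriting only the _init() bodies.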
Are these SOC 5 volt?
 
1.7 - 5.5 V for the 5LP family, 1.8 - 5.5 V for the 4 family.

Regards, Dana.
 
@grandad, PSOC really did two things: it created a GUI programming environment,
and it put that on top of hardware, including routing and analog.

Have you ever looked at block programming?

One example: imagine how long it would take you to do a talking volt/frequency/period/power/current
meter application... Here is a basic 2-hour dev-time talking voltmeter:


Same, but a WiFi-based problem solver for a remote property control issue:


By the way, you don't have to dig out your C handbook/cheat sheet all the time to stop compiler
complaints; that's all taken care of for you. Kids are using this stuff in 6th grade to program
bots. I use the various block programmers to do fast-turnaround one-off stuff, as well as C
to do the more involved projects.

mBlock, Tuniot, Snap4Arduino, Flowcode, Node-RED, Visuino, Ardublock: once you use one, the
rest are a piece of cake.


Regards, Dana.
 
Dana, I appreciate your enthusiasm for SOC. I was like that with the CDP1802, and if I was starting again, yes, it would be possible to get stuck in. At the moment, at a 'maker/hobbyist' level, it would take the 'fun' out of designing, building, coding, debugging (starting again), etc. HNY.
 
The 1802 was a great part. Same for the NSC800 CMOS Z80. In fact, one part that foretold the
progress of the last decade and this one was the NEC V25, V35 series. Classic: they were too early to
market and so sadly got yanked. One would write ordinary C, and then use microcode
to create and manage HW processes concurrently and simultaneously in the background.
Precursors to today's multi-core parts and the FPGA industry. In NEC's case, our profession suffers
from many of the same ills all people do: change is a challenge.

To close on my end: PSOC still requires "designing, building, coding, debugging", as does any
processor solution. All that's different is we do not have to build a system anymore out of
largely discrete transistor logic and the paper tape tools / teletype I worked with in the late 70's, ugh.
My joy comes when a project is basically running and then I start adding the pile of ideas I
have to expand and improve it, and then move on to the next project. Lots of ADD here.

The post in #26 was simply to show the breadth of tools now. I have always been a visual
learner, so I am attracted to GUI solutions, and to ones that remove a lot of repetitive work. My first
introduction to that was Processor Expert in Freescale CodeWarrior. Now that we have decent
C compilers and IDE configurators and tools, the solution process is so much more enjoyable.
Just the error checking on typing alone is terrific. Modern compilers now come with deeper C error
checking. C is oft quoted as a "strongly typed language", read: do it my way or the highway.
I am not a typist.


Regards, Dana.
 
C is oft quoted as a "strongly typed language", read: do it my way or the highway.
I was under the impression that C was widely known as a 'weakly typed language', which was why Pascal used to be the teaching language of choice in universities: that was a 'strongly typed language' and forced good practice on the user. C allows all kinds of horrible practices, hence it 'can' be very difficult to understand, if written poorly.
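The point about C's permissiveness can be shown in a couple of lines that C compiles quietly but a strongly typed language like Pascal would reject. A minimal sketch (the function name is just for illustration):

```c
/* C quietly mixes types that a strongly typed language would reject. */
double weak_typing_demo(void)
{
    char c = 'A';      /* ASCII 65 */
    int  n = c + 1;    /* char silently promoted to int: 66 */
    double d = n / 2;  /* integer division first, THEN silent widening:
                        * result is 33.0, not the 33.5 a reader may expect */
    return d;
}
```

Modern compilers warn about some of this, but none of it is an error in C.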
 
Dana, I am always struck by how much the silicon world has changed over my techie life, even now with the speed and density of the chips. I do/try to keep up with advancements via YT mainly. The interest in vintage 'computers' is understandable. I have always been reluctant to 'move with the times', even with MCU devices, not just from the cost but the time and equipment needed (I was given a dead oscilloscope, and that helped). I guess like many ETO people I knew all the 74-logic numbers and functions almost off by heart, along with ASCII, and hex, etc. Up until a year or so ago I still kept an XP machine. Now totally Linux Mint. Windows... that is one system I was keen to move away from. My 1802 was $15 + shipping, a lot of cash back in the 70's, but I had been bitten by the electronics bug.
 



Regards, Dana.
 
grandad, I was building TTL-based music synthesizers in the early 70's; painful, wiring
like a grapevine. In school after the Navy, around the mid 70's, I got a tech job at Earthquake
Engineering, a department of the State of California Water Resources. I was assigned
to build an array of TTL counters that could be used to rapidly determine the
epicenter of earthquakes. A grad student was the designer. Miserable because of all the
switching noise, but to shorten the story: when an earthquake was detected at the first
sensor, it would dump all jobs on the department mainframe and wait for two more
sensors to report to compute the location. From ~3/4 hour down to a minute, and later to ms
as a result of algorithm work.

Times have changed; something to worry about is AI. This video shows just how rapidly
AI development is occurring. I have used it (ChatGPT) to write code; its learning and
output are not crazy. Hang in the video until they start doing results testing: absolutely frightening.
When AI asks itself what is the biggest threat to its existence and the planet, it's not going
to be squirrels or sharks it picks out, it's going to be us.....



Regards, Dana.
 
My first computer coding experience was as a hopeful 'student' sat at a COBOL lump of metal the size of a small car. As a test we had to code a bubble sort of numbers in the least amount of 'words'. Think I came second out of, say, 6 engineers. I found the m/c had an upper limit of 99999999, so I just compared the list against that. Mr Tester did not think that was an appropriate approach. And I went back to fixing my gears, pitmans, detents, and switches, until it all got put in skips. I still have a bunch of DTL, TTL, and CMOS DIL chips. Don't think I will ever dispose of them; I even buy more. Years ago I did a TTL MIDI I/F for my old ELKA organ; great fun, amazing what you could achieve.
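For anyone who never had to write one on a COBOL-era machine, the bubble sort in question looks like this in C — a generic textbook sketch, not grandad's original:

```c
/* Classic bubble sort: repeatedly swap adjacent out-of-order pairs
 * until a full pass makes no swaps. O(n^2), but tiny to write. */
void bubble_sort(int *a, int n)
{
    int swapped = 1;
    while (swapped) {
        swapped = 0;
        for (int i = 1; i < n; i++) {
            if (a[i - 1] > a[i]) {
                int t = a[i - 1];   /* swap the out-of-order pair */
                a[i - 1] = a[i];
                a[i] = t;
                swapped = 1;
            }
        }
    }
}
```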
 
When AI asks itself what is the biggest threat to its existence and the planet, it's not going
to be squirrels or sharks it picks out, it's going to be us....
The developers have a "Mission". What about AI's mission, and has Gemini got an OFF switch?
 
I too have a pile of DTL, TTL, CD4000, and some RTL. Any guess how many
good parts are in landfills? Maybe when Earth returns to the planet of the apes
they will be digging them out.....

Regards, Dana.
 
The developers have a "Mission". What about AI's mission, and has Gemini got an OFF switch?
Over the years I have read a lot of SciFi, and most of it has come true. Much of it focused
on the age of machines......

My grandkids are screwed. Maybe even my sons.......
 