Hello. I was just wondering if anyone could give the basic idea of how digital data is read from a camera (that outputs digitized data). What I have so far is that usually there is an I2C port to control the functions of the camera, and there is a parallel interface that outputs camera data.
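(For anyone else reading: the I2C side really is just writing (register, value) pairs to the sensor's slave address. Here's a minimal software sketch of what the master sends; the device address and register numbers below are invented for illustration, not from any real sensor's datasheet.)

```python
def pack_register_write(dev_addr, reg, value):
    """Build the payload bytes an I2C master sends to write one 8-bit
    camera register: [address byte with R/W#=0, register, value].
    Start/stop conditions and ACKs are handled by the I2C hardware,
    so only the data bytes are modeled here."""
    if not (0 <= reg <= 0xFF and 0 <= value <= 0xFF):
        raise ValueError("register and value must fit in 8 bits")
    return bytes([dev_addr << 1, reg, value])  # LSB of address byte = 0 (write)

# Hypothetical configuration sequence (addresses/values made up):
config = [(0x12, 0x80),   # e.g. software reset
          (0x11, 0x01)]   # e.g. clock prescaler
writes = [pack_register_write(0x21, reg, val) for reg, val in config]
```

In practice you'd hand each of those byte strings to whatever I2C peripheral or bit-bang routine your micro provides.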
-an 8-pin (or so) bus for pixel data. I presume this carries the 8 bits of data per pixel, assuming monochrome; I'm not sure how colour data would be output unless the camera sends the R, G, and B values (or whatever three values describe the pixel) sequentially before moving on to the next pixel
-a pixel clock, which I assume clocks the individual bits that make up a pixel (or maybe another clock does that, and this pixel clock is actually a pixel sync pulse indicating that all the current information belongs to the same pixel)
-a frame sync pulse which, when high, indicates that all the pixel information belongs to the same frame
-a line sync pulse, which is similar to the frame sync pulse except that it indicates that all the pixel information coming out belongs to the same line
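To make the gating between those signals concrete, here's a small software model of the usual scheme (signal names and polarities are my assumptions; real sensors vary, so check the datasheet): the 8-bit bus is sampled once per pixel-clock rising edge, but only while the line sync is high, and the line sync's falling edge closes out each line.

```python
def capture_frame(samples):
    """Software model of a parallel camera interface capture.

    `samples` is a sequence of (pclk, vsync, href, data) tuples, one per
    time step.  A whole 8-bit pixel is latched on each rising edge of
    pclk while href (line sync) is high; a falling edge of href closes
    the current line; vsync low means we're outside the active frame.
    Returns a list of lines, each a list of pixel values.
    """
    lines, line = [], []
    prev_pclk = prev_href = 0
    for pclk, vsync, href, data in samples:
        if not vsync:                          # vertical blanking: ignore
            prev_pclk, prev_href = pclk, href
            continue
        if prev_href and not href and line:    # falling href ends a line
            lines.append(line)
            line = []
        if href and pclk and not prev_pclk:    # rising pclk latches a pixel
            line.append(data & 0xFF)
        prev_pclk, prev_href = pclk, href
    if line:                                   # flush a trailing line
        lines.append(line)
    return lines

# Simulated waveform for a 2x2 frame: two pclk phases per pixel,
# href dropped between lines, vsync high throughout.
samples = []
for row in ([10, 20], [30, 40]):
    for px in row:
        samples.append((0, 1, 1, px))   # pclk low
        samples.append((1, 1, 1, px))   # pclk rising edge: latch px
    samples.append((0, 1, 0, 0))        # horizontal blanking
frame = capture_frame(samples)          # [[10, 20], [30, 40]]
```

One detail this model highlights: the pixel clock latches a *whole* pixel off the parallel bus per edge, not individual bits, which is the common arrangement on 8-bit parallel sensor interfaces.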
Is this the basic idea of how it works? I've heard it's usually best to use a dedicated CPLD or FPGA to read this data straight into RAM, where the processor can fetch it, but I'm not sure where to start with CPLDs or FPGAs (I'm leaning towards CPLDs since they are faster, nonvolatile, and reading data into a RAM doesn't seem like something so complicated you need an FPGA). The only ones I've used are the Xilinx Spartan 3s at university, but those were somewhat of a black box: I just followed the instruction sheets and learned more about the design theory than the actual usage of the hardware.
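On the "read straight into RAM" part: the glue logic is essentially a write strobe plus an address counter that vsync resets. Here's a toy software model of that behavior, under the same assumptions as above (the buffer size is arbitrary; a real design would size it for one frame or one line):

```python
class CaptureBuffer:
    """Toy model of the CPLD glue logic: each latched pixel is written
    to the next sequential RAM address; the frame sync resets the
    address counter so the CPU can read a complete frame afterwards."""

    def __init__(self, size):
        self.ram = bytearray(size)  # stands in for the external SRAM
        self.addr = 0               # the address counter in the CPLD

    def on_pixel(self, data):
        # Write-enable pulse: store the byte and advance the counter.
        if self.addr < len(self.ram):
            self.ram[self.addr] = data & 0xFF
            self.addr += 1

    def new_frame(self):
        # Rising edge of frame sync: rewind to the start of the buffer.
        self.addr = 0

buf = CaptureBuffer(4)
buf.new_frame()
for px in (10, 20, 30, 40):   # pixels as they come off the bus
    buf.on_pixel(px)
```

In real hardware this is a handful of flip-flops and a counter, which is part of why people suggest a small CPLD is enough; the harder part is usually arbitrating the RAM between the capture logic and the CPU.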