
Most existing video surveillance systems use USB video capture and Ethernet transmission and require a video compression scheme, which in turn usually requires operating-system support. The resulting development platforms are expensive, so the cost of such video surveillance systems remains high and is difficult for small factories and home users to accept.
This project uses an ARM7 development platform based on the S3C44B0 microprocessor and drives the USB interface chip CH374 to capture video data, providing a low-cost video acquisition solution. USB video capture relies on USB synchronous (isochronous) transfers, but most USB host chip designs focus on control and bulk transfers, and documentation on synchronous transfers is extremely rare. This article therefore provides a worked design example of USB synchronous transfer.
1 The working principle of the system
A typical USB video capture system is shown in Figure 1. A USB system comprises two basic elements: a host and physical devices. A USB system can have only one USB host, but multiple physical devices may be connected. The device in this design is a USB camera; the USB host consists of a USB host controller, a microprocessor, and driver software. The USB system is organized into distinct working layers:
The USB interface layer provides the physical connection between the host controller and the device. In the device layer, the USB host calls the driver to send and receive control information for the USB device through endpoint 0. The function layer carries out the actual data transfer: the host must select the appropriate interface and endpoint and call the interface functions provided by the low-level driver to obtain the video data stream from the USB camera.
1.1 USB camera SPCA561A
Video capture is generally realized with a USB camera. As shown in Figure 2, the SPCA561A USB camera integrates a lens, a CMOS sensor, a USB image processor, and a USB controller.
Compared with a CMOS sensor interfaced directly to the microprocessor, a USB camera costs more, but it is easy to implement, saves CPU resources, and enjoys very rich driver support. The SPCA561A provides a single-chip camera solution, integrating a CIF CMOS sensor, an image processor, and a USB controller on one die, which greatly reduces cost and development difficulty. Its drawbacks are a resolution of only about 100,000 pixels and a low frame rate, but it is well suited to small-scale surveillance systems with modest image-quality requirements.
1.2 USB host controller CH374
CH374 is a general-purpose embedded USB bus interface chip. It supports both USB host and USB device modes and, at low and full speed, supports control, bulk, interrupt, and synchronous transfers. On the local side, CH374 provides an 8-bit data bus, read, write, and chip-select control lines, and an interrupt output, so it connects easily to the system bus of a DSP/MCU/MPU or other controller. Most embedded USB host interface chips do not provide a synchronous transfer mode; a major feature of CH374 is that it does, making video and audio streaming possible.
This system uses CH374 as the USB host controller, as shown in Figure 3. CH374 is connected to the S3C44B0 in bus mode, and the microprocessor implements the USB host driver by reading and writing the CH374 registers.
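The read/write access pattern described above can be sketched as follows. The two-cycle address/data scheme and the register file size are assumptions for illustration, not the real CH374 register map; on actual S3C44B0 hardware the two ports would be memory-mapped into an external bus bank, whereas here they are backed by a small software model so the access pattern compiles and runs anywhere.

```c
#include <stdint.h>

/* Hypothetical bus-mode access to CH374: an address-latch cycle (A0=1)
 * followed by a data cycle (A0=0). Backed by a mock register file here. */
#define CH374_NUM_REGS 64                 /* assumed register file size */
static uint8_t mock_regs[CH374_NUM_REGS]; /* stands in for the real chip */
static uint8_t cur_addr;                  /* last register address latched */

static void bus_write(int a0, uint8_t v)  /* a0=1: address, a0=0: data */
{
    if (a0) cur_addr = v % CH374_NUM_REGS;
    else    mock_regs[cur_addr] = v;
}
static uint8_t bus_read(void) { return mock_regs[cur_addr]; }

/* Driver primitives built on top of the two bus cycles */
void ch374_write_reg(uint8_t addr, uint8_t val)
{
    bus_write(1, addr);   /* latch register address */
    bus_write(0, val);    /* write data byte        */
}
uint8_t ch374_read_reg(uint8_t addr)
{
    bus_write(1, addr);   /* latch register address */
    return bus_read();    /* read data byte         */
}
```

On real hardware, `bus_write`/`bus_read` would be replaced by volatile accesses to the memory-mapped chip-select region; consult the CH374 datasheet for the actual command and register layout.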
1.3 Principle of USB Synchronous Transmission
Synchronous transfer is mainly used to transmit audio and video signals. Such data is periodic and real-time: it places high demands on timeliness but can tolerate a certain bit error rate. USB therefore reserves up to 90% of the bus bandwidth for this kind of traffic, and other transfer types cannot occupy it while a synchronous transfer is in progress.
To guarantee real-time delivery, synchronous transfer neither retransmits erroneous data nor responds with handshake packets at the hardware level. The host sends an SOF synchronization token every 1 ms and then receives the packet sent by the device; the data flow is shown in Figure 4.
In synchronous transfer the size of each data packet is constant. Taking the SPCA561 as an example, the appropriate interface number must be selected before starting synchronous transfer, and the chosen number determines the packet size the device will send: for example, interface number 1 sends 128 bytes per packet, while interface number 6 sends 896 bytes. The interface number is set with the USB standard device request SET_INTERFACE. Since the CH374 buffer is 128 bytes, this design uses interface number 1, so each packet received over the synchronous transfer is 128 bytes.
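The SET_INTERFACE request mentioned above is a standard 8-byte setup packet defined in Chapter 9 of the USB specification. A minimal sketch of building it (the article's "interface number" corresponds to the alternate-setting value in `wValue`):

```c
#include <stdint.h>

/* Build the 8-byte setup packet for the standard SET_INTERFACE request:
 * bmRequestType = 0x01 (host-to-device, standard, interface recipient),
 * bRequest = 11 (SET_INTERFACE), wValue = alternate setting,
 * wIndex = interface number, wLength = 0. */
void usb_make_set_interface(uint8_t pkt[8], uint8_t alt, uint8_t iface)
{
    pkt[0] = 0x01;  /* bmRequestType */
    pkt[1] = 11;    /* bRequest: SET_INTERFACE */
    pkt[2] = alt;   /* wValue low: alternate setting */
    pkt[3] = 0;     /* wValue high */
    pkt[4] = iface; /* wIndex low: interface number */
    pkt[5] = 0;     /* wIndex high */
    pkt[6] = 0;     /* wLength low: no data stage */
    pkt[7] = 0;     /* wLength high */
}
```

In this design the host would send this packet over endpoint 0 with `alt = 1` to select the 128-byte synchronous packet size.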
1.4 Video data acquisition process
As shown in Figure 5, the video signal is captured by the SPCA561A camera and encoded by its internal image processor into a specified format, generally RGB or YUV; the SPCA561, however, uses a proprietary S561 image format (similar to RGB). Because one frame of image data is far too large for a single synchronous packet, it is split into multiple units, and a header is added before each unit (containing the current packet sequence number and image frame information) to form multiple synchronous packets, which are sent onto the USB bus through a FIFO buffer. The host controller receives each packet in synchronous mode, strips the header, and merges the payloads back into S561-format data to form a complete image frame. Finally, software pre-encodes this image frame into YUV420 data for subsequent compression.
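The strip-header-and-merge step can be sketched as below. The one-byte header length is an assumption for illustration; the real SPCA561A header also carries frame information alongside the sequence number.

```c
#include <stdint.h>
#include <string.h>

/* Sketch of reassembling S561 frame data from 128-byte synchronous
 * packets: drop the (assumed 1-byte) header and append the payload
 * to the current frame buffer. */
#define PKT_SIZE  128
#define HDR_SIZE  1                  /* assumed header length */
#define FRAME_MAX (352 * 288 * 2)    /* generous bound for one CIF frame */

static uint8_t frame_buf[FRAME_MAX];
static int     frame_len;

/* Append one packet's payload to the frame; returns bytes accepted. */
int frame_append(const uint8_t *pkt, int len)
{
    int payload = len - HDR_SIZE;
    if (payload <= 0 || frame_len + payload > FRAME_MAX)
        return 0;                    /* empty packet or frame overflow */
    memcpy(frame_buf + frame_len, pkt + HDR_SIZE, payload);
    frame_len += payload;
    return payload;
}
```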
2 The realization of USB camera driver
USB cameras are not standard USB peripherals. Unlike other USB devices, each manufacturer's camera chip has its own set of vendor-defined device requests, and the datasheets for these camera chips are not publicly available, so writing camera drivers is very difficult; a driver that supports many cameras becomes very complicated. This article covers only the method for driving the SPCA561A camera.
2.1 USB camera initialization
Initializing a USB camera takes two steps: the first is enumeration of the camera, and the second is the camera's custom configuration.
(1) Device enumeration
Device enumeration is the standard device request sequence defined in Chapter 9 of the USB specification. For a USB camera, the enumeration process is as follows:
① Get the device descriptor. The device descriptor gives the payload of endpoint 0, i.e., its maximum packet size.
② Set the address. Assign the device an address other than the default address 0.
③ Get the configuration descriptor. This takes two stages: the first request fetches only the first 4 bytes of the configuration descriptor to learn its true total length; the descriptor is then fetched a second time using that length. This descriptor contains the device's configuration information and multiple interface descriptions, from which the available interface numbers and their corresponding packet payloads can be obtained.
④ Set the configuration. The main value set is bConfigurationValue, the fifth field of the configuration descriptor.
⑤ Set the interface. Different interface numbers of the USB camera correspond to different packet payloads. The interface selected in this design is number 1, whose corresponding packet payload is 128 bytes.
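The five enumeration steps above are all control transfers on endpoint 0, each beginning with an 8-byte setup packet. A sketch of the request sequence (field values per USB specification Chapter 9; descriptor type codes: 1 = device, 2 = configuration):

```c
#include <stdint.h>

typedef struct { uint8_t b[8]; } setup_t;

/* Pack the generic 8-byte setup fields (all multi-byte fields little-endian). */
static setup_t make_setup(uint8_t type, uint8_t req,
                          uint16_t val, uint16_t idx, uint16_t len)
{
    setup_t s;
    s.b[0] = type;                       /* bmRequestType */
    s.b[1] = req;                        /* bRequest      */
    s.b[2] = val & 0xFF; s.b[3] = val >> 8;  /* wValue  */
    s.b[4] = idx & 0xFF; s.b[5] = idx >> 8;  /* wIndex  */
    s.b[6] = len & 0xFF; s.b[7] = len >> 8;  /* wLength */
    return s;
}

/* ① GET_DESCRIPTOR(device), 18 bytes */
setup_t get_device_desc(void)        { return make_setup(0x80, 6, 0x0100, 0, 18); }
/* ② SET_ADDRESS(a) */
setup_t set_address(uint8_t a)       { return make_setup(0x00, 5, a, 0, 0); }
/* ③ GET_DESCRIPTOR(configuration): first 4 bytes, then full wTotalLength */
setup_t get_config_desc(uint16_t n)  { return make_setup(0x80, 6, 0x0200, 0, n); }
/* ④ SET_CONFIGURATION(bConfigurationValue) */
setup_t set_configuration(uint8_t v) { return make_setup(0x00, 9, v, 0, 0); }
/* ⑤ SET_INTERFACE(alternate setting, interface) */
setup_t set_interface(uint8_t alt, uint8_t ifc) { return make_setup(0x01, 11, alt, ifc, 0); }
```

Step ③ works because wTotalLength sits at bytes 2–3 of the configuration descriptor, so the 4-byte first fetch is enough to learn the full length.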
(2) Custom settings
The USB camera is not a standard USB peripheral; it requires many custom settings, which may be called "custom device requests". These are transmitted in the same packet format as standard device requests, and their purpose is to modify the camera's internal registers to configure image acquisition and compression. The fields in which standard and custom device request packets differ are listed in Table 1. The custom device requests are numerous and cover the following aspects:
① Timing generation settings, including the image acquisition frequency and oscillator settings.
② Image processing settings, including image window size, compression type, color distribution, and other attributes.
③ Memory settings: configure the image buffer.
④ Control and status settings, including starting and stopping image acquisition, the data transfer mode, the current state, and other attributes.
The program contains nearly a hundred initial settings; for details, refer to the open-source code in reference [1]. After initialization, the image format can be set as required. The SPCA561A supports four formats: SQVGA (160×120), QCIF (176×144), QVGA (320×240), and CIF (352×288). Once configuration is complete, the camera is started and data transfer begins.
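The four supported formats can be kept in a small lookup table for the driver's configuration code, a sketch of which follows (the table holds only the resolutions listed above; any register values needed to select a format would come from the open-source code in reference [1]):

```c
#include <stdint.h>

/* The four image formats supported by the SPCA561A. */
typedef struct {
    const char *name;   /* format name           */
    uint16_t    w, h;   /* width, height, pixels */
} spca_fmt_t;

static const spca_fmt_t spca561_formats[] = {
    { "SQVGA", 160, 120 },
    { "QCIF",  176, 144 },
    { "QVGA",  320, 240 },
    { "CIF",   352, 288 },
};
```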
2.2 Synchronous transmission and image frame processing
The synchronous transfer process itself is very simple and does not even include handshake packets; but because synchronous transfer has strict timing requirements, handling the synchronous data is quite difficult. This driver uses an interrupt service routine to receive the synchronous data, while the processing of that data is performed outside the interrupt service.
① The interrupt service routine flow is shown in Figure 6. Each time a synchronization interrupt occurs, the routine first reads the received 128-byte synchronous packet from the USB host controller's buffer and stores the data in the storage area provided by the data processing program. It then sends the PID_IN token and the endpoint number, sets the synchronous transfer type, and starts the next transfer. The CH374 host sends an SOF synchronization token every 1 ms; after the USB device receives the SOF token, it transmits the next synchronous packet.
② The synchronous data processing flow is shown in Figure 7. After the interrupt ends, the data processing program runs: it reads the first bytes of the synchronous packet (the header) and checks the packet sequence number, which ranges from 0 to 0xFF. A sequence number of 0xFF marks an invalid packet, which must be discarded. A sequence number of 0 either marks the very first synchronous packet acquired, whose data is stored directly into the image frame, or marks the first packet of the next frame after the current frame has ended; in the latter case the completed current frame must be processed and the next frame made current. At this point a full frame of image data has been obtained.
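The per-packet decision logic of step ② can be sketched as a small dispatch function. The return codes and the `in_frame` flag are illustrative conventions, not the article's actual implementation:

```c
#include <stdint.h>

/* Dispatch on the sequence number from the packet header (0..0xFF).
 * in_frame tracks whether data for the current frame has been stored.
 * Returns: -1 drop packet, 0 store into current frame,
 *           1 current frame complete (process it, then this packet
 *             starts the next frame). */
int sync_seq_dispatch(uint8_t seq, int *in_frame)
{
    if (seq == 0xFF)
        return -1;              /* invalid packet: discard */
    if (seq == 0) {
        if (*in_frame)
            return 1;           /* first packet of the NEXT frame */
        *in_frame = 1;          /* very first packet ever: start frame */
        return 0;
    }
    return 0;                   /* continuation packet: just store */
}
```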
2.3 Precoding of image data
The processed image frame is S561-format data, an RGB-type image that cannot be used directly by subsequent image encoders. Commonly used video compression standards (such as H.263 and MPEG-4) take video input in YUV420 format, so the S561 data must be pre-encoded into YUV420. Because the algorithm is fairly complicated, it is not described in detail here; refer to the bayer_decode() function in the source code of reference [1]. With this, the CH374-based camera driver is complete.
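The full S561→YUV420 pre-encoding follows the open-source bayer_decode() in reference [1]; only the standard RGB→YUV conversion at its core is sketched here, in the usual integer fixed-point form (Y = 0.299R + 0.587G + 0.114B, with U and V as scaled, offset color differences):

```c
#include <stdint.h>

static uint8_t clamp_u8(int v) { return v < 0 ? 0 : v > 255 ? 255 : (uint8_t)v; }

/* Convert one RGB pixel to YUV using integer arithmetic (scale 1000),
 * a common approximation suitable for a CPU without an FPU:
 *   Y = 0.299R + 0.587G + 0.114B
 *   U = 0.492(B - Y) + 128
 *   V = 0.877(R - Y) + 128        */
void rgb_to_yuv(uint8_t r, uint8_t g, uint8_t b,
                uint8_t *y, uint8_t *u, uint8_t *v)
{
    int yy = (299 * r + 587 * g + 114 * b) / 1000;
    *y = clamp_u8(yy);
    *u = clamp_u8((492 * (b - yy)) / 1000 + 128);
    *v = clamp_u8((877 * (r - yy)) / 1000 + 128);
}
```

For YUV420 output, the U and V values would additionally be subsampled 2×2; that step is omitted here.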
3 Design experience
Because this USB host is built on a low-end embedded hardware system, there is no operating system support and no USB data stream analysis software such as Bus Hound. It is also difficult to find a reference implementation of USB synchronous transfer for an embedded platform, so the design was very difficult. The author's design experience centers on the choice of reference programs.
The design divides into two parts. One is the low-level CH374 host controller driver, mainly covering device detection and enumeration; this part can draw on drivers for similar host controllers, such as the host driver for the Cypress SL811HS chip [3]. The other part comprises the camera initialization and the video data reading and processing programs, for which the only reference material is the open-source USB camera driver for Linux. During design, one must first understand the principles of Linux device drivers; only after a thorough analysis of the Linux USB camera driver can a design approach emerge.
Concluding remarks
With this driver added to the existing video compression program, video is transmitted over the network to a PC for playback. At SQVGA (160×120) image resolution it reaches 7 frames per second, which basically meets practical needs. The CH374-based USB camera driver provides a low-cost way to implement video capture on an embedded platform, helps make video capture systems affordable, and plays a positive role in popularizing video surveillance.