Resolution in Detail (Part 1)

Resolution is an important concept in image processing: it is a technical parameter that measures how much detail an image can represent. However, resolution can be expressed in many ways, and its meaning varies with context. It is therefore necessary to understand exactly what "resolution" means in each situation and to clarify how the different measures relate to one another. This article briefly introduces the resolution of several common image input/output devices, and then describes the characteristics of images at different resolutions and how they relate when input or output on devices of different resolutions.

First, a few concepts about resolution

To grasp the meaning of resolution accurately, the following concepts must first be made clear.
Resolution: A general term covering device resolution, screen resolution, image resolution, scan resolution, and bit resolution.

Device Resolution: Also known as output resolution, refers to the number of dots per inch that an output device can produce, for example the resolution of monitors, inkjet printers, laser printers, thermal wax printers, and plotters. This resolution is measured in dots per inch (DPI). PC monitors typically have device resolutions between 60 and 120 DPI, while printing devices range from roughly 180 to 720 DPI.
Screen Resolution: Also known as screen frequency, refers to the number of halftone lines per inch in the screen used to print grayscale images or color separations. This resolution is measured in lines per inch (LPI).

Image Resolution: Refers to the amount of information stored in an image, typically measured in pixels per inch (PPI). The image resolution and the image dimensions together determine the file size and the output quality: the larger these values, the more disk space the image file occupies. File size grows with the square of the image resolution, so if you keep the image dimensions unchanged and double its resolution, the file size increases by a factor of four. Image resolution also affects how large the image appears on screen. If you raise the resolution from 72 PPI to 144 PPI (keeping the image dimensions unchanged), then on a monitor with a device resolution of 72 DPI the image is displayed at twice its actual size.
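
To make these two relationships concrete, here is a minimal Python sketch; the function names and the 4×5 inch / 3-bytes-per-pixel figures are illustrative assumptions, not part of any particular imaging library or file format.

    def uncompressed_size_bytes(width_in, height_in, ppi, bytes_per_pixel=3):
        # Pixel dimensions are the physical size multiplied by the resolution.
        return int(width_in * ppi) * int(height_in * ppi) * bytes_per_pixel

    def on_screen_scale(image_ppi, monitor_dpi=72):
        # A monitor shows one image pixel per device dot, so the apparent size
        # is the ratio of the image PPI to the monitor DPI.
        return image_ppi / monitor_dpi

    print(uncompressed_size_bytes(4, 5, 72))   # 4 x 5 inch image at 72 PPI  ->  311040 bytes
    print(uncompressed_size_bytes(4, 5, 144))  # same image at 144 PPI       -> 1244160 bytes (4x larger)
    print(on_screen_scale(144))                # 144 PPI image on a 72 DPI monitor -> 2.0 (twice actual size)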

Scan Resolution: The resolution chosen before scanning an image. It affects the quality and usability of the resulting image file and determines how the image will be displayed or printed. If the scanned image is intended for display on a 640×480 pixel screen, the scan resolution need not exceed the device resolution of a typical monitor, i.e. generally no more than 120 DPI. In most cases, however, the scanned image is prepared for later output on a high-resolution device. If the scan resolution is too low, image processing software may have to build halftone dots from the color value of a single pixel, which produces coarse output. Conversely, if the scan resolution is too high, the digital image contains more information than the printing process needs, which not only slows down printing but can also introduce slight shifts in tone in the printed output. In general, the correct scan resolution should be calculated from the screen resolution, the size of the scanned original, and the output image size.

Multiply the maximum dimension of the output image by the screen resolution, then multiply by the screen-line ratio (usually 2:1) to obtain the total number of pixels required. Dividing this total by the maximum dimension of the scanned original gives the optimum scan resolution, that is: scan resolution = (maximum output size × screen resolution × screen-line ratio) / maximum size of the scanned original.
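
As a rough illustration, the following Python sketch applies this formula; the function name and the 8-inch output / 150 LPI / 4-inch original values are assumptions chosen only for demonstration.

    def scan_resolution(output_size_in, screen_lpi, scan_size_in, line_ratio=2.0):
        # Total pixels needed along the output's longest dimension, divided by
        # the size of the original, gives the scan resolution in PPI.
        total_pixels = output_size_in * screen_lpi * line_ratio
        return total_pixels / scan_size_in

    # A 4-inch original to be printed 8 inches wide with a 150 LPI screen:
    print(scan_resolution(output_size_in=8, screen_lpi=150, scan_size_in=4))  # -> 600.0 PPI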

Bit Resolution: Also known as bit depth, it measures the number of bits of information stored in each pixel. This resolution determines how many colors can be displayed at one time, and is typically 8, 24, or 32 bits. Bit resolution is sometimes also called color depth.
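
As a small illustration, an n-bit pixel can take 2^n distinct values; the following Python sketch (the function name and output formatting are illustrative) prints the color counts for common bit depths.

    def color_count(bit_depth):
        # An n-bit pixel can represent 2**n distinct values.
        return 2 ** bit_depth

    for bits in (1, 8, 24):
        print(bits, "bits ->", color_count(bits), "colors")
    # 1 bit   ->        2 colors (black and white)
    # 8 bits  ->      256 colors
    # 24 bits -> 16777216 colors ("true color"); 32-bit modes typically add an 8-bit alpha channel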