OCR, DPI and accuracy

What is OCR? OCR is short for optical character recognition, a technology that converts scanned images of handwritten, typewritten, or printed text into machine-encoded text. OCR is now used across many enterprises, not just libraries and governments. (Note: OCR is not new; it has been under development since 1912.)

Modern OCR can reach around 98% accuracy, but that accuracy drops when the input images are of poor quality, so image quality matters a great deal to OCR software. Most OCR software specifies a minimum capture quality; for photographed documents, an autofocus camera of sufficient resolution is generally required.

Image quality is commonly measured in DPI (dots per inch). Informally, 300 DPI is the standard for scanning, because 300 DPI achieves near-maximal accuracy without sacrificing speed or file size. For example, the accuracy improvement from 200 DPI to 300 DPI is roughly twice the improvement between any other pair of resolutions, while the improvement from 300 DPI to 400 DPI is nearly zero. Scanning at too low a DPI, on the other hand, can cause recognition errors.
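As a rough illustration of what these DPI figures mean in practice, the sketch below (assuming a standard US Letter page, 8.5 × 11 inches; the figures are illustrative only) converts a scanning resolution into pixel dimensions, showing why file size grows quickly with DPI:

```python
# Rough sketch: how DPI translates into pixel dimensions for a scanned page.
# Assumes a US Letter page (8.5 x 11 inches); figures are illustrative only.

PAGE_WIDTH_IN = 8.5
PAGE_HEIGHT_IN = 11.0

def pixel_dimensions(dpi):
    """Return (width_px, height_px) for a page scanned at the given DPI."""
    return (round(PAGE_WIDTH_IN * dpi), round(PAGE_HEIGHT_IN * dpi))

for dpi in (200, 300, 400):
    w, h = pixel_dimensions(dpi)
    # Uncompressed image size grows with the *square* of the DPI,
    # which is why 400 DPI costs much more storage than 300 DPI.
    megapixels = w * h / 1e6
    print(f"{dpi} DPI -> {w} x {h} px ({megapixels:.1f} MP)")
```

At 300 DPI a Letter page is 2550 × 3300 pixels (about 8.4 megapixels), which helps explain why 300 DPI is treated as the sweet spot: it is detailed enough for accurate recognition while keeping files to a manageable size.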


