Software that Determines Photo Quality and Advises Photographer When to Reshoot

Technology #15890

Researchers
Dapeng Oliver Wu
Ruigang Fang
Managed By
Richard Croley
Assistant Director, 352-392-8929

Achieves Human-Level Perception as It Assesses Image Quality in Real-Time Without a Reference Image

This image assessment software can be built into a digital camera or cell phone camera to score images quantitatively in real time and warn the photographer when a photo is of inferior quality. Even excluding camera phones, the global digital camera market is projected to reach nearly $20 billion by 2020. University of Florida researchers have developed no-reference image quality assessment software that quantifies images based on blurriness, noisiness, and blockiness, three critical factors affecting photo quality, and warns the photographer when a photo should be retaken. The software can be incorporated into standard digital cameras as well as camera phones.
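As a rough illustration of how such a warning could be surfaced on-device, the sketch below compares a predicted quality score against a threshold after each shot. The score_image function, the 1-to-5 scale, and the cutoff value are assumptions for illustration only, not details of the licensed software.

```python
# Hypothetical capture-and-review step; score_image and ACCEPTABLE_SCORE are
# placeholders assumed for illustration, not part of the UF software.
ACCEPTABLE_SCORE = 3.5  # assumed cutoff on a 1-5 perceptual quality scale


def review_shot(image, score_image):
    """Warn the photographer right after capture if predicted quality is low."""
    score = score_image(image)  # no reference image is required
    if score < ACCEPTABLE_SCORE:
        return f"Quality score {score:.2f}: consider retaking this photo."
    return f"Quality score {score:.2f}: photo looks good."
```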

Application

Image quality assessment software that achieves human-level perception

Advantages

  • Compatible with cellular phones and digital cameras, providing superior automatic image quality assessment to a broad range of products
  • Requires no reference image for quality assessment, achieving human-level perception based on quantitative measures alone

Technology

This software computes values for critical attributes of an image and maps those values to determine whether the photo will be satisfactory. University of Florida researchers created metrics to analyze blurriness, noisiness, and blockiness, three critical factors affecting users’ quality of experience. The software uses the Laplace distribution and the k-Nearest Neighbors algorithm to determine a human perception score for an image. It requires less computation than existing no-reference image quality assessment tools and, according to experimental results, achieves human-level quality perception.
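A minimal sketch of this kind of pipeline appears below. It assumes Laplacian variance, a median-filter residual, and block-boundary differences as stand-ins for the three metrics, and uses scikit-learn's KNeighborsRegressor to map the metrics to a subjective score; the researchers' actual metric definitions and model configuration may differ.

```python
# Illustrative sketch only: the metric definitions, kNN settings, and training
# data below are assumptions, not the researchers' published method.
import numpy as np
from scipy.ndimage import laplace, median_filter
from sklearn.neighbors import KNeighborsRegressor


def blurriness(gray):
    """Variance of the Laplacian; lower values suggest a blurrier image."""
    return float(np.var(laplace(gray.astype(np.float64))))


def noisiness(gray):
    """Mean absolute residual after median filtering (assumed noise proxy)."""
    g = gray.astype(np.float64)
    return float(np.mean(np.abs(g - median_filter(g, size=3))))


def blockiness(gray, block=8):
    """Mean intensity jump across 8x8 block boundaries (assumed proxy for
    JPEG-style blocking artifacts)."""
    g = gray.astype(np.float64)
    col_diff = np.abs(np.diff(g, axis=1))[:, block - 1::block]  # vertical boundaries
    row_diff = np.abs(np.diff(g, axis=0))[block - 1::block, :]  # horizontal boundaries
    return float((col_diff.mean() + row_diff.mean()) / 2.0)


def features(gray):
    """Three-element feature vector; scaling/normalization omitted for brevity."""
    return [blurriness(gray), noisiness(gray), blockiness(gray)]


def fit_perception_model(training_images, subjective_scores, k=5):
    """Fit k-Nearest Neighbors on images with known subjective quality scores
    (e.g., mean opinion scores) so new photos can be scored without a reference."""
    X = np.array([features(img) for img in training_images])
    model = KNeighborsRegressor(n_neighbors=k)
    model.fit(X, np.asarray(subjective_scores))
    return model


def predict_quality(model, gray):
    """Predict a human perception score for a single grayscale image."""
    return float(model.predict([features(gray)])[0])
```

In this sketch the k-Nearest Neighbors regressor simply averages the subjective scores of the most similar training images in the three-metric feature space, which keeps the per-photo computation small enough to run on a camera or phone.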