PSCR envisions a future where computer vision systems deploy complex strategies to get around camera impairments. The goal of this challenge is to establish a starting point for this new line of research by crowdsourcing: 1) media with camera impairments that cause computer vision algorithms to fail; and 2) proposals for how to best quantify those failure rates. These resources will kick-start research on the missing component: no-reference (NR) metrics for image quality assessment (IQA) and video quality assessment (VQA) that identify quality problems that will cause computer vision algorithms to fail.
What are no-reference (NR) metrics for image quality assessment (IQA) and video quality assessment (VQA)?
NR metrics for VQA estimate the quality of videos as they would be perceived by human observers. NR metrics for IQA do the same for images. This challenge will enable NR metrics designed for computer observers (computer vision applications). These metrics are called “no reference” because they cannot refer to a pristine version of the media or to other metadata.
In addition to assessing the overall media quality, the NR metric must identify the specific quality problems that contribute to the overall quality. That is, the NR metric cannot just say, “the quality is bad;” it must also explain why the quality is bad, so that the computer vision system can act accordingly.
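To make this requirement concrete, the sketch below shows one possible shape for an NR metric's output: an overall score plus a per-impairment breakdown. The class, field names, and scores are illustrative assumptions, not part of any challenge specification.

```python
from dataclasses import dataclass, field

@dataclass
class NRQualityReport:
    """Hypothetical output of an NR metric aimed at computer vision.

    All names and value ranges here are illustrative assumptions.
    """
    overall_quality: float  # e.g., 0.0 (unusable) to 1.0 (pristine)
    impairments: dict = field(default_factory=dict)  # impairment name -> severity

def assess(media) -> NRQualityReport:
    # A real metric would analyze the pixels; this stub only shows the
    # shape of the answer: an overall score plus the reasons behind it.
    return NRQualityReport(
        overall_quality=0.4,
        impairments={"lens_flare": 0.7, "motion_blur": 0.2},
    )

report = assess(media=None)
# The downstream computer vision system can act on the dominant problem.
worst = max(report.impairments, key=report.impairments.get)
```

With a report like this, a deployed system could, for example, re-aim the camera when lens flare dominates rather than simply discarding low-quality frames.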
What are the two categories for submissions?
Contestants may submit datasets demonstrating either single camera impairments or single computer vision applications.
Single camera impairment submissions demonstrate specific impairments that appear across images or videos used in a dataset, such as lens flare or dirt on a lens.
Judges will evaluate these submissions according to: impairment diversity; impairment realism; breadth of impairment severity; depiction of false positives; and to what extent other impairments are avoided.
Single computer vision application dataset submissions present a variety of conditions encountered in a specific scenario, such as automated driving.
Judges will evaluate these submissions according to: how well the dataset portrays the computer vision scenario; how well the dataset portrays the variety of camera impairments that will be encountered when this computer vision algorithm is deployed; diversity of camera position; and variety of weather, lighting, and other environmental factors.
Are there limits on the type of submissions that will win Phase 1, for example image datasets versus video datasets?
No. This challenge does not limit the ratio of media types (image or video) or of experiment designs (single computer vision application or single impairment) that will be awarded invitations to Phase 2.
Can you tell me more about failure rate assessment methods?
Contestants should choose a method to assess failure rate that is synergistic with this challenge’s goal of understanding how camera impairments impact computer vision algorithms. Contestants should apply the reliability measure that is most appropriate to the computer vision application of choice based on the current state of the art. The confusion matrix offers diverse strategies to assess failure rates.
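As one illustration of the confusion-matrix strategies mentioned above, the sketch below derives standard binary-classification failure measures. Which measure is "most appropriate" depends on the computer vision application; these are textbook definitions, not a method prescribed by the challenge.

```python
def failure_rates(tp, fp, fn, tn):
    """Derive common failure-rate measures from a binary confusion matrix.

    tp/fp/fn/tn = true positives, false positives, false negatives,
    true negatives. Counts below are illustrative.
    """
    miss_rate = fn / (tp + fn)         # false negative rate
    false_alarm_rate = fp / (fp + tn)  # false positive rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return miss_rate, false_alarm_rate, accuracy

# Example: a detector evaluated on impaired media. A contestant could
# report how each rate degrades as impairment severity increases.
miss, fa, acc = failure_rates(tp=80, fp=5, fn=20, tn=95)
```

For a public safety application, the miss rate and false alarm rate often matter more than raw accuracy, which is one reason the choice of measure should follow the application.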
Can you show me an example dataset of images or videos?
The Consumer Digital Video Library (CDVL) has several image quality and video quality datasets that may help contestants. These datasets assess image quality or video quality and are suitable for NR metric research. However, these datasets ignore computer vision and, as such, do not perfectly match the goals of this challenge. After registering on CDVL, find each dataset by performing a keyword search for its name: CCRIQ, ITS4S4, or ITS4S3.
The CCRIQ dataset contains a variety of image quality problems from consumer cameras. This dataset uses a conventional experiment design, where scenes are carefully selected for specific characteristics and each scene is photographed with multiple cameras.
The ITS4S4 dataset focuses on one specific impairment: video pans. This dataset contains a mixture of simulated camera pans and real camera pans, recorded with diverse pan speeds and frame rates. The fastest pan speed is based on a bodycam’s erratic movements when the wearer is running.
The ITS4S3 dataset shows diverse camera impairments associated with six simulated public safety scenarios. One of these scenarios (weather and vehicles) explores how rain and snow impact video recordings in city environments.
Why are simulated impairments disallowed?
Simulated image quality and video quality impairments do not accurately reflect the environments and impairments that public safety systems encounter when deployed. Gaussian noise looks different from camera sensor noise. Video compression performed in real-time on hardware implements different algorithm choices than software compression performed out-of-service.
To maximize realism, this challenge is limited to modern cameras. As a rule of thumb, consider what cameras and which quality problems will be encountered by real systems in 5 to 10 years.
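The difference between Gaussian noise and camera sensor noise can be sketched numerically: real sensor noise is roughly signal-dependent (shot noise grows with brightness, plus a read-noise floor), while simulated additive Gaussian noise has the same variance everywhere. The Poisson-Gaussian-style model and all parameters below are illustrative assumptions, not measured camera characteristics.

```python
import random

def additive_gaussian(pixel, sigma=5.0):
    """Simulated noise: the variance is identical at every pixel."""
    return pixel + random.gauss(0.0, sigma)

def sensor_like(pixel, gain=0.1, read_sigma=2.0):
    """Rough signal-dependent sketch: shot-noise variance grows with
    brightness, plus a constant read-noise floor -- one reason real
    sensor noise looks different from additive Gaussian noise.
    (Parameters are illustrative, not from a real camera.)"""
    shot_sigma = (gain * max(pixel, 0.0)) ** 0.5
    return pixel + random.gauss(0.0, shot_sigma) + random.gauss(0.0, read_sigma)
```

Sampling both models over a bright region and a dark region shows the sensor-like noise getting visibly stronger in highlights while the Gaussian model stays uniform; an NR metric trained only on the latter may misjudge the former.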
Can our proposal include network packet loss distortions?
No. This challenge focuses on image and video quality problems that are caused by the camera and must be detected via direct examination of the image or video itself. Advances in network technologies and video streaming strategies over the past decade have made packet loss artifacts less relevant. Computer vision systems that hit the market in 5 to 10 years will most likely ensure that the decoded video stream does not have any artifacts caused by packet loss or other network performance problems. Accordingly, this challenge is focused on impairments that will be relevant in 5 to 10 years.
Can our proposal include software encoders?
No. Most commercial cameras only provide compressed images and compressed video streams. For the majority of deployed computer vision systems, which this challenge is designed to address, the image or video compression will occur inside the camera. Software encoders and hardware encoders make different algorithmic choices: their images and videos look subtly different, and the relationship between compression bit-rate and quality differs. Confounding factors include changes in the environment (e.g., the sun going behind a cloud), the camera operator (e.g., shaking hands, framing, slight changes in camera position), and camera processes (e.g., optics, auto-focus).
What will my work be used for?
After the challenge, the datasets will be used for image and video quality research and development (R&D) purposes.
All of the datasets and failure rate methods will be analyzed by PSCR, to understand similarities and differences between human vision and computer vision; to develop NR metrics for computer vision applications; to develop root cause analysis (RCA) algorithms that identify specific camera problems; and to investigate the feasibility of creating much larger datasets, using similar techniques. PSCR expects to publish these findings.
Datasets that are shared on CDVL would enable PhD and MS students to choose thesis topics around this challenge’s goal. These open datasets could also be used by the Video Quality Experts Group (VQEG), to develop improved methods to assess image and video quality for computer vision.
To learn more about prior PSCR research on NR metrics, see the NR Metric Framework GitHub repository and the NTIA/ITS video quality research webpage. A subject matter expert from NTIA/ITS, part of the US Department of Commerce, is serving as technical lead on this challenge.
What are the opportunities for entrepreneurs?
Entrepreneurs can network with cosponsors, learn from the community, and have an impact on public safety. Previous challenge winners have been able to continue public safety research, partner with public safety organizations (PSOs), and learn about future funding opportunities. Visit PSCR’s website for more information on research and funding opportunities with PSCR.
Who is eligible to participate?
Individuals may enter independently, as groups with other individuals, or as teams.
Businesses incorporated and maintaining their place of business in the U.S. may enter as well.
Individuals and Teams:
Must have registered to participate and complied with all of the requirements under section 3719 of title 15, United States Code
Must be age 18 or older and a U.S. citizen or permanent resident of the United States or its territories (in the case of a team submission, at least one member of the team must be eligible and serve as the Official Representative in order to meet eligibility requirements.)
May not be a Federal employee acting within the scope of their employment
NIST Associates are eligible to enter but may not use NIST funding for competing in this challenge, nor are they eligible to receive a cash prize award
Former NIST PSCR Federal employees or Associates are not eligible to compete in a prize challenge within one year from their exit date.
Individuals currently receiving PSCR funding through a grant or cooperative agreement are eligible to compete but may not utilize the previous NIST funding for competing in this challenge
Must not have been convicted of a felony criminal violation under any Federal law within the preceding 24 months
Must not have any unpaid Federal tax liability that has been assessed, for which all judicial and administrative remedies have been exhausted or have lapsed, and that is not being paid in a timely manner pursuant to an agreement with the authority responsible for collecting the tax liability
Must not be suspended, debarred, or otherwise excluded from doing business with the Federal Government.
Individuals or members of a team may participate in only one contestant group; they cannot participate on multiple teams or in multiple contestant categories.
Businesses:
Must be incorporated in and maintain a place of business in the United States or its territories
The submissions will be judged by a qualified panel of expert(s) selected by the Director of National Institute of Standards and Technology (NIST) of the Department of Commerce. The panel may consist of experts from NIST, the Department of Commerce, and industry, who will judge the submissions according to the judging criteria identified in the Official Rules in order to select winners. Judges will not have personal or financial interests in, or be an employee, officer, director, or agent of any entity that is a registered contestant in a contest; or have a familial or financial relationship with an individual who is a registered contestant.
How will I be notified if I win?
The winners will be notified by email, telephone, or mail after the date of announcement of winning results. Each winner of a monetary or non-monetary award will be required to sign and return to the Department of Commerce, National Institute of Standards and Technology, within ten (10) calendar days of the date the notice is sent, an ACH Vendor/Miscellaneous Enrollment Form (OMB NO. 1510-0056) and a Contestant Eligibility Verification in order to claim the prize.
What rights do I retain for my work?
Any applicable intellectual property rights to a submission will remain with the contestant. By participating in the prize challenge, the contestant is not granting any rights in any patents, pending patent applications, or copyrights related to the technology described in the entry. However, by submitting a contest submission, the contestant is granting the Department of Commerce, National Institute of Standards and Technology certain limited rights as listed in the Official Rules.
Phase 2 winners may optionally choose to redistribute their submission for research and development purposes on the Consumer Digital Video Library (CDVL). Winners who choose this option will receive a $12,000 prize. See Official Rules for details.
Winners will be featured on the Department of Commerce, National Institute of Standards and Technology website, newsletters, social media, and other outreach materials. NIST may notify winning contestants’ Congressional representatives, if possible.