# mAP: recognition accuracy

In object detection, mAP is used to judge recognition accuracy, that is, how reliably objects are detected. It is built from the following two indicators:

• Precision (accuracy rate): of the items detected, how many are real items
• Recall (recall rate): of the items in the dataset, how many have been detected

Placing these two concepts in the framework of a standard binary classification problem gives the following formulas:

$$
Precision = \frac{TP}{TP+FP} \\
Recall = \frac{TP}{TP+FN}
$$

For the above, there are:

• TP: positive example, recognized as a positive example
• FP: negative example, recognized as a positive example
• TN: negative example, recognized as a negative example
• FN: positive example, recognized as a negative example
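The two formulas above can be sketched directly from the confusion counts. The function name and example counts below are illustrative, not from the original article:

```python
def precision_recall(tp, fp, fn):
    """Precision = TP/(TP+FP), Recall = TP/(TP+FN)."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return precision, recall

# e.g. 8 correct detections, 2 false alarms, 4 missed objects
p, r = precision_recall(tp=8, fp=2, fn=4)
print(p, r)  # 0.8 and 2/3: precise, but misses a third of the objects
```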

Precision and Recall change with the recognition threshold. Selecting multiple thresholds (without retraining the model) yields multiple Precision/Recall pairs. Plotting these points with Recall on the horizontal axis and Precision on the vertical axis gives the precision-recall curve; the area under this curve is the AP.

![Precision-Recall curve](map.png)
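The threshold sweep described above can be sketched as follows. This is a minimal sketch, assuming each detection has already been matched against the ground truth (so `is_tp` is known); the function name and the toy numbers are assumptions for illustration:

```python
import numpy as np

def average_precision(scores, is_tp, num_gt):
    """AP as the area under the precision-recall curve.

    scores : confidence score of each detection
    is_tp  : 1 if the detection matched a ground-truth box, else 0
    num_gt : total number of ground-truth boxes (denominator of recall)
    """
    order = np.argsort(-np.asarray(scores))       # sort by descending confidence
    tp = np.asarray(is_tp, dtype=float)[order]
    cum_tp = np.cumsum(tp)
    cum_fp = np.cumsum(1.0 - tp)
    precision = cum_tp / (cum_tp + cum_fp)        # P at each threshold
    recall = cum_tp / num_gt                      # R at each threshold
    # area under the P-R curve (rectangle rule over recall increments)
    ap, prev_r = 0.0, 0.0
    for p, r in zip(precision, recall):
        ap += p * (r - prev_r)
        prev_r = r
    return ap

# four detections, four ground-truth boxes (hypothetical data)
print(average_precision(scores=[0.9, 0.8, 0.7, 0.6],
                        is_tp=[1, 0, 1, 1], num_gt=4))
```

Each prefix of the score-sorted detections corresponds to one threshold, so the cumulative sums trace out exactly the Precision/Recall pairs described in the text.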

mAP is the mean of the AP values computed over all object classes. The larger the value, the stronger the detector.
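Averaging per-class AP into mAP is then a one-liner; the class names and AP numbers here are hypothetical:

```python
# per-class AP values (hypothetical numbers for illustration)
ap_per_class = {"car": 0.72, "person": 0.65, "dog": 0.58}

# mAP = mean of the per-class APs
map_value = sum(ap_per_class.values()) / len(ap_per_class)
print(map_value)  # 0.65
```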

# IOU: Detection effect

Generally speaking, IOU measures the accuracy of the predicted bounding box in object detection. It is defined as:

$$
IOU = \frac{A \cap B}{A \cup B}
$$

Here, A is the box predicted by the system and B is the ground-truth box from the dataset labels. IOU measures the degree of overlap between the predicted box and the labeled box: as shown in the figure below, IOU is the area of the shaded (intersection) region divided by the area of the combined region of A and B.

![IOU of two boxes](iou.png)
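For axis-aligned boxes, the intersection and union reduce to a few coordinate comparisons. A minimal sketch, assuming boxes are given as `(x1, y1, x2, y2)` corner coordinates:

```python
def iou(box_a, box_b):
    """IOU of two axis-aligned boxes, each given as (x1, y1, x2, y2)."""
    # corners of the intersection rectangle
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    # clamp to 0 so non-overlapping boxes give zero intersection
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7: overlap area 1, union area 7
```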

Reference: https://cloud.tencent.com/developer/article/1156246 (Object detection metrics: mAP for recognition accuracy, IOU for detection quality, Tencent Cloud Community)