
Table 4. 11-point precision diagram. This example shows a query submitted to two different IR systems (IR1 and IR2), both based on the same collection of 20 documents. Each system ranks all 20 documents, of which 10 are relevant, but IR1 ranks the relevant documents higher on average than IR2 does. The mean average precision is 0.79 for IR1 and 0.40 for IR2. The recall and precision curves for IR1 and IR2 are shown in figure 2.

From: A tutorial on information retrieval: basic terms and concepts

Ranking by IR1

| Rank | Doc | Relevant | Recall | Precision |
|-----:|-----|----------|-------:|----------:|
| 1 | d1 | yes | 0.10 | 1.00 |
| 2 | d2 | yes | 0.20 | 1.00 |
| 3 | d3 | yes | 0.30 | 1.00 |
| 4 | d4 | no | 0.30 | 0.75 |
| 5 | d5 | yes | 0.40 | 0.80 |
| 6 | d6 | no | 0.40 | 0.67 |
| 7 | d7 | yes | 0.50 | 0.71 |
| 8 | d8 | no | 0.50 | 0.63 |
| 9 | d9 | yes | 0.60 | 0.67 |
| 10 | d10 | no | 0.60 | 0.60 |
| 11 | d11 | yes | 0.70 | 0.64 |
| 12 | d12 | yes | 0.80 | 0.67 |
| 13 | d13 | yes | 0.90 | 0.69 |
| 14 | d14 | no | 0.90 | 0.64 |
| 15 | d15 | yes | 1.00 | 0.67 |
| 16 | d16 | no | 1.00 | 0.63 |
| 17 | d17 | no | 1.00 | 0.59 |
| 18 | d18 | no | 1.00 | 0.56 |
| 19 | d19 | no | 1.00 | 0.53 |
| 20 | d20 | no | 1.00 | 0.50 |

Ranking by IR2

| Rank | Doc | Relevant | Recall | Precision |
|-----:|-----|----------|-------:|----------:|
| 1 | d1 | no | 0.00 | 0.00 |
| 2 | d2 | no | 0.00 | 0.00 |
| 3 | d3 | no | 0.00 | 0.00 |
| 4 | d4 | no | 0.00 | 0.00 |
| 5 | d5 | no | 0.00 | 0.00 |
| 6 | d6 | yes | 0.10 | 0.17 |
| 7 | d7 | yes | 0.20 | 0.29 |
| 8 | d8 | yes | 0.30 | 0.38 |
| 9 | d9 | no | 0.30 | 0.33 |
| 10 | d10 | yes | 0.40 | 0.40 |
| 11 | d11 | no | 0.40 | 0.36 |
| 12 | d12 | yes | 0.50 | 0.42 |
| 13 | d13 | no | 0.50 | 0.38 |
| 14 | d14 | yes | 0.60 | 0.43 |
| 15 | d15 | no | 0.60 | 0.40 |
| 16 | d16 | yes | 0.70 | 0.44 |
| 17 | d17 | yes | 0.80 | 0.47 |
| 18 | d18 | yes | 0.90 | 0.50 |
| 19 | d19 | no | 0.90 | 0.47 |
| 20 | d20 | yes | 1.00 | 0.50 |
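The recall, precision, and average-precision columns above are all derived mechanically from the "Relevant" column. A minimal Python sketch (not from the tutorial itself; the `ir1`/`ir2` lists simply transcribe the table's "Relevant" columns) shows the computation:

```python
def recall_precision(relevant_flags, total_relevant):
    """(recall, precision) after each rank of a ranked result list."""
    hits = 0
    curve = []
    for rank, is_relevant in enumerate(relevant_flags, start=1):
        if is_relevant:
            hits += 1
        curve.append((hits / total_relevant, hits / rank))
    return curve

def average_precision(relevant_flags, total_relevant):
    """Average of the precision values at the ranks where relevant docs appear."""
    hits = 0
    total = 0.0
    for rank, is_relevant in enumerate(relevant_flags, start=1):
        if is_relevant:
            hits += 1
            total += hits / rank
    return total / total_relevant

def eleven_point_interpolated(curve):
    """Interpolated precision at recall levels 0.0, 0.1, ..., 1.0: the highest
    precision observed at any rank whose recall is at least that level."""
    return [max(p for r, p in curve if r >= i / 10) for i in range(11)]

# "Relevant" columns from the table (True = yes), ranks 1-20:
ir1 = [True, True, True, False, True, False, True, False, True, False,
       True, True, True, False, True, False, False, False, False, False]
ir2 = [False, False, False, False, False, True, True, True, False, True,
       False, True, False, True, False, True, True, True, False, True]

print(f"AP(IR1) = {average_precision(ir1, 10):.2f}")
print(f"AP(IR2) = {average_precision(ir2, 10):.2f}")
```

With exact arithmetic, IR1's average precision comes out to about 0.78; the 0.79 quoted in the caption likely comes from averaging the already-rounded precision column. The values returned by `eleven_point_interpolated` are what an 11-point precision diagram such as figure 2 plots.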