The measure of accuracy is:
(A) Absolute error
(B) Relative error
(C) Percentage error
(D) Both B and C

Answer
Hint
Accuracy tells how close a measured value is to the actual (true) value. It is judged by comparing the deviation of the observed reading from the true reading.

Complete step-by-step solution
Absolute error is the deviation of a reading from the actual or true reading. Relative error is built on the absolute error: it is the difference between the observed value and the true value, divided by the true value.
$\text{Relative error} = \dfrac{\text{observed value} - \text{true value}}{\text{true value}}$
When the relative error is expressed as a percentage, i.e. by multiplying it by 100, it is called percentage error. Both relative and percentage error give a quantitative measure of the deviation from the true value. Accuracy, however, is defined as the closeness of the measurement to the actual or true value, and both relative and percentage error can serve as its measure. For example, consider two separate experiments with true values of 1000 m and 1 m, each with an absolute error of 0.1 m. The relative errors are 0.0001 (0.01%) and 0.1 (10%) respectively. The absolute error is identical in both cases and so fails to tell us which measurement is more accurate, while relative or percentage error takes the win.
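The worked example above can be sketched in Python (the function names `relative_error` and `percentage_error` are illustrative, not from any particular library):

```python
def relative_error(observed, true_value):
    """Relative error: absolute error divided by the magnitude of the true value."""
    return abs(observed - true_value) / abs(true_value)

def percentage_error(observed, true_value):
    """Relative error expressed as a percentage."""
    return relative_error(observed, true_value) * 100

# Two experiments, each with the same absolute error of 0.1 m:
r1 = relative_error(1000.1, 1000)  # true value 1000 m -> about 0.0001, i.e. 0.01%
r2 = relative_error(1.1, 1.0)      # true value 1 m    -> about 0.1, i.e. 10%
print(r1, r2)
```

Although both readings are off by the same 0.1 m, the relative errors differ by three orders of magnitude, which is exactly why relative (or percentage) error, not absolute error, measures accuracy.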
Therefore the correct answer is option (D).

Note
Accuracy and relative error are inversely related: as the accuracy of a measurement increases, its deviation from the true value, and hence its relative error, decreases.