A significant digit is one that is known to be correct and reliable, given the inaccuracy present in the supplied information, any approximations made along the way, and the mechanics of the calculation itself.
As a rule, the last significant digit that you report in the answer to a problem should have the same order of magnitude as the last significant digit in the given data.
It would be inappropriate to report more significant digits in the answer than were given in the supplied information since that implies that the output of a calculation is somehow more accurate than the input to it.
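This rule can be sketched in Python. The helper name `round_sig` is hypothetical, not a standard-library function; it rounds a computed result back to the number of significant figures carried by the given data:

```python
import math

def round_sig(x: float, sig: int) -> float:
    """Round x to the given number of significant figures."""
    if x == 0:
        return 0.0
    # Order of magnitude of the leading digit, e.g. 1 for 56.088.
    magnitude = math.floor(math.log10(abs(x)))
    return round(x, sig - 1 - magnitude)

# Both inputs carry three significant figures...
length = 12.3
width = 4.56
area = length * width        # raw product: 56.088
# ...so the answer should be reported with three as well.
print(round_sig(area, 3))    # 56.1
```

Reporting 56.088 here would suggest five-figure accuracy that the three-figure inputs cannot support.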
The accuracy of any number used in estimations or other technical calculations is specified by the number of significant figures that it contains.
A significant figure is any nonzero digit, or any zero that does not serve merely to locate the decimal point. Leading zeros only fix the position of the decimal point and are therefore not significant; 0.00254, for example, has three significant figures.
A number cannot be interpreted as being any more accurate than its least significant digit, nor should a quantity be specified with any more digits than are justifiable by its measured accuracy.
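The counting convention above can be illustrated with a short sketch; `count_sig_figs` is a hypothetical helper built on Python's standard `decimal` module, which discards leading zeros when parsing:

```python
from decimal import Decimal

def count_sig_figs(value: str) -> int:
    """Count significant figures in a decimal string.
    Leading zeros only locate the decimal point and are not counted.
    Trailing zeros in a bare integer such as '1200' are ambiguous;
    write '1.2e3' or '1.200e3' to make the intent explicit."""
    # Decimal keeps exactly the significant digits: '0.00254' -> (2, 5, 4)
    return len(Decimal(value).as_tuple().digits)

print(count_sig_figs("0.00254"))  # 3 -- leading zeros are not significant
print(count_sig_figs("2.540"))    # 4 -- a trailing zero after the point is
```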
The implied precision of a number is one-half the place value of its last significant digit. The factor is one-half because the last digit represents a rounding, either up or down, of the trailing digits, so the true value may lie anywhere within half a unit of the last place. For example, 2.54 implies a precision of ±0.005.
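This half-unit rule can be computed directly from the written form of a number. The helper `implied_precision` below is a hypothetical sketch using the standard `decimal` module, and it assumes every written digit is significant:

```python
from decimal import Decimal

def implied_precision(value: str) -> float:
    """Half the place value of the last digit written.
    '2.54' has its last digit in the hundredths place,
    so its implied precision is 0.5 * 0.01 = 0.005."""
    exponent = Decimal(value).as_tuple().exponent
    return 0.5 * 10.0 ** exponent

print(implied_precision("2.54"))  # half of 0.01
print(implied_precision("120"))   # half of 1
```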
As a rule, during the intermediate steps of a calculation, retain several more significant digits than you expect to report in the final answer.
That way, rounding errors cannot creep into your solution, compound along the way, and distort the final answer.
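The effect of rounding too early can be demonstrated with a small experiment: summing the same measurements once at full precision and once after rounding each term. The numbers here are illustrative, not from the text:

```python
# Sum 100 identical measurements two ways: retaining full precision
# throughout, and rounding each term to three decimal places first.
# The per-term rounding error (~0.0004 here) compounds across the sum.
values = [0.12344] * 100

exact = sum(values)                        # full precision retained
coarse = sum(round(v, 3) for v in values)  # each term rounded early

print(exact, coarse, exact - coarse)  # the gap has grown to ~0.04
```

Rounding only once, at the end, confines the error to half a unit in the last reported place instead of letting it accumulate.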