Three terms are common in precision practices, and they are frequently used incorrectly or vaguely: accuracy, repeatability, and resolution. Because the present discussion concerns machining and fabrication methods, the definitions will be given in terms of machine tools. However, these terms apply equally to metrology, instrumentation, and experimental procedures.
Accuracy is the ability of a machine to move to a commanded position it has not visited before. This implies the machine must calculate the new position in terms of its feedback system parameters, or lack thereof, and then move to that position. It does not mean the machine is "shown" or "taught" the position and the feedback parameters are stored, as is often done in robotic applications. Accuracy brings the entire machine, hardware and controls, to bear on the task at hand and is therefore a demanding specification to place on a machine.
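One common way to quantify positioning accuracy is to command a set of previously unvisited positions and compare them against readings from an independent external gauge. The sketch below illustrates this idea; the position values and the choice of worst-case deviation as the accuracy figure are illustrative assumptions, not a standardized test procedure.

```python
# Hypothetical sketch: quantifying positioning accuracy from test data.
# All positions are in micrometers; the values are illustrative only.

commanded = [0.0, 100.0, 200.0, 300.0]   # positions the controller was asked to reach
measured  = [0.2, 100.5, 199.6, 300.3]   # positions an external gauge actually observed

# Take accuracy here as the worst-case deviation from the commanded position.
errors = [abs(m - c) for c, m in zip(commanded, measured)]
accuracy = max(errors)
print(f"worst-case positioning error: {accuracy:.2f} um")
```

Note that the external gauge must itself be more accurate than the machine under test, or the comparison says more about the gauge than the machine.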
In the area of measurements, the quantity to be measured is never known; therefore, instrument accuracy is extremely important. Here, accuracy is the ability of the instrument to provide a quantitative value for the unknown parameter. This has implications beyond accuracy in a machine tool, where the "unknown" is, for example, the true position of a cutting tool, which was computed and where the phenomena (the machine tool performance) should be known and understood with some level of confidence. In instrumentation, the resolution of the output (how many decimal places are shown) is often mistakenly taken as the instrument accuracy. The accuracy of the instrument is best quantified by comparison with fundamental standards and careful calibration.
Repeatability is the ability of the machine to revisit a location, and it carries further implications, including the direction from which the movement is made. If the point is repeatedly approached from the same direction, the term used is repeatability. If the point is approached from two directions, as with the work table on an ordinary milling machine, the term is bi-directional repeatability. Good bi-directional repeatability is more difficult to achieve than repeatability because it involves hysteresis of mechanical motions, among other possible factors such as feedback deadband.
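The distinction between the two terms can be made concrete with a small numeric sketch: positions recorded at the same nominal target, approached from each direction. The values and the simple range-based spread measure are illustrative assumptions; real specifications typically use a statistical measure over many runs.

```python
# Hypothetical sketch: repeatability vs. bi-directional repeatability.
# Each list holds stopping positions (micrometers) at the same nominal
# target; the values are illustrative only.

from_left  = [100.1, 100.2, 100.1, 100.2]   # target approached from one direction
from_right = [100.6, 100.7, 100.6, 100.7]   # same target, opposite direction

# Uni-directional repeatability: spread within a single approach direction.
repeatability = max(from_left) - min(from_left)

# Bi-directional repeatability: spread over both directions combined, which
# picks up the hysteresis offset between the two approaches.
both = from_left + from_right
bi_repeatability = max(both) - min(both)

print(f"repeatability:               {repeatability:.1f} um")
print(f"bi-directional repeatability: {bi_repeatability:.1f} um")
```

The offset between the two groups is exactly the hysteresis effect mentioned above: each direction alone repeats well, but the two directions disagree about where the target is.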
In instrumentation measurements, repeatability is again perhaps more complex than in machine tools. The quantity to be measured with an instrument is always influenced by the presence of the instrument, and this can change the unknown parameter, especially over time, which can have a significant effect on repeatability. This is not to say that metrology tools do not affect machine tools in a similar way. In the figure above, the distribution of the data points is due to repeatability errors associated with the phenomena and the instrument.
Resolution is the least increment of movement the machine is capable of making. Because machines use digital controllers and motors with discrete feedback, such as encoder slits or interferometer fringes, the resolution is the quantum in which all motions are made. If the system resolution is n, then all motions are integer multiples of n. Note that the system resolution is not the finest resolution among the components but rather the coarsest (largest). A machine controller that can calculate a movement of 0.1 micrometers will not be able to reliably move the machine an increment of 0.1 micrometers if the feedback encoder has a reliable resolution of only 1 micrometer; the control system will never "see" such a small movement. Therefore, one must be very careful in interpreting machine performance figures.
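The controller-versus-encoder mismatch described above can be sketched in a few lines. The resolutions below are the figures from the text; the snapping function is a hypothetical simplification that ignores servo dynamics and treats motion as pure quantization to feedback counts.

```python
# Hypothetical sketch: why system resolution is set by the coarsest component.
# The controller can compute 0.1 um steps, but the encoder only reports 1 um
# counts, so achievable motion is quantized to whole encoder counts.

controller_resolution = 0.1   # um - finest increment the controller can calculate
encoder_resolution    = 1.0   # um - finest increment the feedback can detect

# The effective (system) resolution is the largest of the component resolutions.
system_resolution = max(controller_resolution, encoder_resolution)

def achievable_move(commanded_um: float) -> float:
    """Snap a commanded move to an integer multiple of the system resolution."""
    counts = round(commanded_um / system_resolution)
    return counts * system_resolution

print(achievable_move(0.1))   # a commanded 0.1 um step is below the encoder's notice
print(achievable_move(2.4))   # larger moves land on whole encoder counts
```

A commanded 0.1 micrometer move snaps to zero counts, exactly the "never sees such a small movement" behavior described above.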
For an instrument, resolution is the least count of its output. Because instrumentation is mostly digital, it is very easy to display many insignificant digits, and the user must be very careful when interpreting such data. If a temperature recorder, for example, outputs to one-hundredth of a degree, there is no assurance that it is reporting the correct temperature. One may know the reported temperature with high resolution, but it may be the entirely wrong temperature. Equipment manufacturers and users often use the term "precision" to describe resolution, which can lead to a false conclusion about the instrument accuracy.
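The temperature-recorder example can be made concrete with a short sketch. The offset value and the recorder itself are hypothetical; the point is only that a reading displayed to two decimal places can still be wrong by more than a degree.

```python
# Hypothetical sketch: high display resolution does not imply accuracy.
# A recorder shows temperature to 0.01 degC yet carries a fixed calibration
# offset that no number of displayed digits can reveal. Values are illustrative.

true_temperature = 20.00   # degC, established by a calibration standard
sensor_offset    = 1.37    # degC, an uncorrected calibration error

reading = round(true_temperature + sensor_offset, 2)
print(f"display: {reading:.2f} degC")   # two decimals -> high resolution

error = abs(reading - true_temperature)
print(f"error:   {error:.2f} degC")     # ...but the reading is still far off
```

Only comparison against a standard, as discussed earlier under accuracy, exposes the offset; the display alone gives no hint of it.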