
Metrology

Author: Liang

Feb. 04, 2024


Tags: Machinery

PRECISION METROLOGY

 

 

 

Introduction

 

Metrology is the scientific study of measurement. One cannot embark on the pursuit of precision manufacturing without an equally passionate journey into the challenges (and perils!) of precision metrology. This document is intended to provide a brief introduction to, and overview of, this complex subject. A printed version of these notes is also available.

 

 

 

Precision Metrology is Hard Work!

 

The sooner you accept the wise words of Israelle Widjaja, that “properly measuring things is hard,” the sooner you’ll begin to understand how to make accurate and precise measurements.

 

 

 

Rule of Ten

 

The Rule of Ten (or Rule of One to Ten) states that the discrimination (resolution) of the measuring instrument should divide the tolerance of the characteristic to be measured into ten parts.  In other words, the gage or measuring instrument should be at least ten times finer than the tolerance of the characteristic being measured.  Many believe this applies only to the instruments used to calibrate a gage or measuring instrument, when in reality it applies to the choice of instrument for any precision measuring activity.  The whole idea is to choose an instrument capable of detecting the amount of variation present in a given characteristic (i.e., part feature).

 

To achieve reliable measurements, the instrument needs to be accurate enough to accept all good parts and reject all bad ones; equivalently, the gage should neither reject good parts nor accept bad ones.  The real problem arises when parts are accepted using an instrument that only discriminates to thousandths, while the customer, using gages that discriminate to ten-thousandths, rejects those same parts for being 0.0008″ over or under the specification limit.

 

Practically speaking, this means that reliably measuring a part feature toleranced at ±0.0005″ (a total tolerance band of 0.001″) requires a measuring tool with a resolution and an accuracy of 0.0001″.
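
As a quick illustration of the arithmetic, here is a minimal sketch (hypothetical values and function names of my own, not from any standard library) that computes the resolution the Rule of Ten calls for and checks a candidate instrument against it:

```python
# Rule of Ten sketch: the instrument's resolution should be no coarser than
# one tenth of the total tolerance band it is asked to verify.

def required_resolution(tolerance_band: float) -> float:
    """Finest resolution the Rule of Ten calls for, in the same units."""
    return tolerance_band / 10.0

def meets_rule_of_ten(instrument_resolution: float, tolerance_band: float) -> bool:
    """True if the instrument discriminates finely enough for this tolerance."""
    return instrument_resolution <= required_resolution(tolerance_band)

# A feature toleranced at +/- 0.0005 in spans a total band of 0.001 in.
band = 2 * 0.0005
print(required_resolution(band))           # 0.0001
print(meets_rule_of_ten(0.0001, band))     # True  (tenths-reading micrometer)
print(meets_rule_of_ten(0.001, band))      # False (thousandths-reading caliper)
```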

 

 

 

Accuracy, Precision, and Reproducibility

 

Accuracy refers to how close a measurement is to a true (actual) value or a value accepted as being true.

 

Precision is a measure of the spread of repeated readings (i.e., repeatability), and is independent of accuracy.

 

Reproducibility is the degree to which a measurement can be reproduced or replicated by someone else working independently.
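
One informal way to see the distinction is to compare repeated readings against a reference value accepted as true. The sketch below uses made-up readings of a nominal 0.5000″ gage block purely for illustration:

```python
import statistics

# Hypothetical repeated readings (inches) of a 0.5000 in reference gage block.
reference = 0.5000
readings = [0.5004, 0.5003, 0.5005, 0.5004, 0.5003]

mean_reading = statistics.mean(readings)
bias = mean_reading - reference        # accuracy: offset from the accepted true value
spread = statistics.stdev(readings)    # precision: repeatability of the readings

print(f"mean  = {mean_reading:.5f} in")
print(f"bias  = {bias:+.5f} in  (accuracy error)")
print(f"stdev = {spread:.5f} in  (precision / repeatability)")
# A tiny stdev combined with a large bias means the instrument is precise but not accurate.
```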

 

 

Got Calibration?

 

A measuring instrument is useless if not calibrated regularly against a reliably calibrated gage.

 

 

 

Constant Force

 

A measuring instrument that offers no means of applying a constant contact force can never achieve the same level of repeatability or reproducibility as one that does.  In addition, an instrument that does provide constant contact force only works properly if its clutch or ratchet is rotated at a consistent velocity, so technique still matters.

 

 

 

NTP

 

Proper measurements should always be conducted as close to NTP (normal temperature and pressure) as possible: 68°F and 1 atm (14.7 psia).

 

 

Be Careful!

 

Whenever possible, measure in an environment that will not damage the part or measuring instrument if either is dropped.

 

Never touch precision-ground surfaces (e.g., gage blocks, gage pins, calibration rings, precision measuring surfaces) with your bare hands, as doing so will cause them to rapidly corrode, ruining their accuracy.  Always wear gloves, remove any anti-corrosion protectant with WD-40 and a new blue shop towel, and reapply anti-corrosion protectant (LPS) after use.

 

 

Never force any measuring instrument.  If a caliper or micrometer won’t move freely, investigate why; most have a locking screw or cam, so check that it isn’t engaged before you damage the instrument.

 

 

Cleanliness is Key

 

Clean the contact jaws or tips with alcohol and a piece of tissue paper or a blue shop towel before use.

 

 

Got Zero?

 

Always remember to double-check the zero of the measuring instrument before use.  This seems fundamental, but it’s surprisingly easy to overlook when paying attention to so many other things.  It also means you will need calibration gages or standards for instruments that are not self-zeroing (unlike a 0–1″ micrometer, which can be zeroed simply by closing its anvils).

 

  

 

Thermal Growth

 

Understand that metals have a typical coefficient of linear expansion of about 0.000010 in/(in·°F); therefore, holding on to a measuring instrument and/or a part long enough to warm it roughly 30 °F toward body temperature will cause a 4″ nominal part to change length by 0.0012″ due to temperature alone (0.000010 in/(in·°F) × 4 in × 30 °F ≈ 0.0012 in)!
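
The arithmetic above is just the linear expansion relation ΔL = α · L · ΔT; here is a minimal sketch using the same assumed coefficient and an approximate 30 °F rise toward body temperature:

```python
# Linear thermal expansion: delta_L = alpha * L * delta_T
ALPHA = 0.000010        # in / (in * degF), the typical value assumed above
LENGTH = 4.0            # in, nominal part length
DELTA_T = 30.0          # degF, roughly body temperature minus 68 degF

delta_L = ALPHA * LENGTH * DELTA_T
print(f"thermal growth ~= {delta_L:.4f} in")   # ~= 0.0012 in
```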

 

For this reason you should always (well, whenever practically possible) use an indicator stand to hold a precision measuring instrument and protect it from thermal growth due to body heat.  In addition, you should always allow adequate time for the part(s) being measured to reach NTP.

 

 

Multiple Measurements

 

Always take at least three measurements to be “carelessly certain” of the ballpark value.  The spread between these measurements should be consistent with the repeatability you are seeking from your measurement process.

 

 

Gage Blocks and Gage Pins

 

Become proficient with gage blocks and gage pins, as these are typically manufactured to ±0.000100″ or ±0.000050″ (depending on their grade rating), and are good for moderate precision calibrations.

 

When using them, always wear gloves, work over a safe surface in case you accidentally drop one (never over the open box!), and coat them with rust inhibitor (LPS) when finished.
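
As a simple illustration of how individual block tolerances accumulate, the sketch below sums a hypothetical wrung stack for a 1.2345″ length and its worst-case deviation, assuming the ±0.000050″ per-block figure quoted above (a deliberately simplified model that ignores wringing film thickness and calibration data):

```python
# Hypothetical wrung stack for a 1.2345 in length from a standard English set:
# 0.1005 + 0.1340 + 1.0000.  Worst-case tolerance is simply summed per block;
# real practice would use each block's calibration certificate instead.

BLOCK_TOL = 0.000050                 # in, per-block tolerance assumed from the text
stack = [0.1005, 0.1340, 1.0000]     # in

nominal = sum(stack)
worst_case = BLOCK_TOL * len(stack)

print(f"nominal stack length = {nominal:.4f} in")        # 1.2345 in
print(f"worst-case deviation = +/-{worst_case:.6f} in")  # +/-0.000150 in
```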

 

 

 

LEFT: Instructions on how to use gage blocks (video).  RIGHT: Use and care of gage blocks.

 

 

LEFT: Applications of gage pins (video).  RIGHT: Example of a gage pin set.

 

 

LEFT and RIGHT: Using gage blocks to calibrate a micrometer and bore gage.

 

 

Abbé and Parallax Errors

 

Research Abbé error and parallax error to understand why calipers are not regarded very highly in metrology circles. ☺

 

The Abbé principle states: “Only when the datum line of the measuring system and that of the measured workpiece lie on the same line is a measurement most accurate.”  As the drawing shows, when there is a distance (h) between the measuring faces and the reading axis, a measuring error appears (ε = b - a = h·tan θ).  Therefore, measuring force and tool distortion must be taken into account during such measurements.  Think about what happens when the jaws of a dial caliper are zeroed by bringing their flat surfaces into contact, and a measurement is then made without the jaws in flat contact against the artifact.
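
To get a feel for the magnitude, the sketch below evaluates ε = h·tan θ with assumed, purely illustrative numbers for the offset and the angular misalignment:

```python
import math

# Abbe error: epsilon = h * tan(theta)
#   h     -- offset between the line of measurement and the reading (scale) axis
#   theta -- angular misalignment of the sliding jaw

h_offset = 1.5                  # in, assumed jaw-to-scale offset on a caliper
theta = math.radians(0.05)      # rad, an assumed 0.05 degree of jaw tilt

epsilon = h_offset * math.tan(theta)
print(f"Abbe error ~= {epsilon:.5f} in")   # ~= 0.00131 in
```

With the offset essentially zero, as in a micrometer measuring along its own spindle axis, the same tilt produces a negligible error, which is the heart of the Abbé principle.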

 

 

 

LEFT: Proper method of calibration using a length standard; RIGHT: Additional Abbé error introduced because of location of applied measurement force.

 

 

Parallax error is a perceived shift in an object’s position as it is viewed from different angles, and it is inherent in virtually every analog measurement. 

 

 

Parallax error when reading a linear scale, as on a caliper (left) and when reading a vernier dial, as on a micrometer (right).

 

 

Indicators

 

Since I already have a document on indicators, I will simply include the link here.

 


