Thanks !
Link here, from Google:
https://www.google.com/search?q=spee...hrome&ie=UTF-8
The accuracy is 100%, since we have defined the length of a metre in such a way that the speed of light is exactly 299 792 458 m/s.
So more precise measurements of the speed of light simply result in more precise measurements of the length of a metre.
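That definitional point can be sketched in a few lines of Python. The constant is the real SI value; the pulse timing at the end is a made-up illustrative number, not a real measurement:

```python
# Since 1983 the metre is defined so that light travels exactly
# 299,792,458 m in one second, so "measuring c" and "realising the
# metre" are the same experiment.
C = 299_792_458  # m/s, exact by definition

# The time for light to cover one metre is therefore exact too:
t_per_metre = 1 / C  # seconds
print(f"Light covers 1 m in {t_per_metre:.6e} s")  # 3.335641e-09 s

# A lab timing a light pulse over a baseline is really measuring
# length; the timing value below is purely illustrative.
pulse_time = 1.0e-6  # s (assumed example)
baseline = C * pulse_time  # metres, limited only by the clock
print(f"Baseline: {baseline:.3f} m")  # 299.792 m
```

So a "better measurement of c" now shows up as a better realisation of the metre, exactly as described above.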
Grant Hutchison
I wondered why the speed was stated with no comments on precision in the previous link.
The link for the Wiki article on the Metre is here:
https://en.wikipedia.org/wiki/Metre
Back when we were measuring the speed of light, rather than the length of a metre, the uncertainty was around one part in a billion, because that's roughly the threshold at which it made sense to define the speed of light as a round number of metres per second. I don't know if anyone has attempted to better that; there aren't many practical applications for measurements with precision better than one part in 10^9.
Grant Hutchison
Here is a web page that gives some explanation of how it's been measured.
http://math.ucr.edu/home/baez/physic...08%20km%20away
It says that in 1972 it was measured with a precision of a tenth of a millimeter per second.
I'm actually surprised we don't need the meter defined to ten or twelve places. So often science and technology need extreme precision.
When I took a surveying class, we had to account for the expansion of a steel tape measure away from its calibrated "standard" conditions. Roughly, each degree of temperature (or unit of humidity) away from standard changes the tape's length by about 1 part in 29 million. First we got the lesson on significant figures. Then we had to define a space we had measured, using the data collected in the field, entirely by hand. The result was 25 handwritten pages of long division. If you skipped the lesson on significant figures, it was 25 pages of long division done with the wrong numbers. Since the proof was so easy, it was a great way to make yourself nuts: when your proof didn't match, you had to go back through a pile of really trivial math to find where you'd forgotten what you were told about significant figures.
(At the time, the computer software we had would clip off at four places past the decimal. With a physical ruler and eyeballs, four places past the decimal is wildly outside "reasonable". That was the point of doing it by hand: we learned about limitations and what counts as a reasonable figure.)
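The tape-correction arithmetic above can be sketched in a few lines. The coefficient (1 part in 29 million per degree, taken from the post) and the standard temperature are illustrative assumptions, not survey standards:

```python
# Linear correction of a taped distance for temperature, as described
# in the surveying anecdote above. Both constants are assumptions for
# illustration: ALPHA comes from the post, T_STD is a guessed
# calibration temperature.
ALPHA = 1 / 29_000_000   # fractional length change per degree
T_STD = 68.0             # degrees F, assumed calibration temperature

def corrected_length(measured_ft, temp_f):
    """Apply the linear expansion correction to a taped distance."""
    return measured_ft * (1 + ALPHA * (temp_f - T_STD))

# A 100 ft reading taken 30 degrees above standard:
d = corrected_length(100.0, 98.0)
print(f"{d:.7f} ft")  # 100.0001034 ft
```

The correction over 100 ft is about a ten-thousandth of a foot, far below anything eyeballs on a tape can resolve, which was exactly the significant-figures lesson.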
For practical applications, you can get incredible precision from your tools, but the human connected to the tool isn't that good. Surprisingly, we could actually detect who had been holding the steel tape for a particular measurement with just eyeballs and pencils: some people pull harder, and they do it consistently enough to be detectable.
Back to the OP: you can do all of this now with a laser DME (distance-measuring equipment). It's still subject to the human holding it.