Nope. Not a single percent increase in measurement accuracy. Sorry. This is a famous (and particularly silly) strawman argument fashioned by the 'There is no Global Warming' crowd.
And the reason for this is that mercury has expanded and contracted with heat input exactly as much a hundred years ago as it did last year. Or the year before. Or today.
The last few years have been the hottest on record. And our records, made with kick-*** reliable mercury thermometers calibrated at the freezing and boiling points of water (which, incidentally, haven't changed over the last hundred-odd years either), go back a good 150 to 200 years. Now what on earth does that tell you? Yeah! There's nothin' wrong! It's all a big hoax!
Err on the side of caution, like my momma always used to say. Because the alternative is unthinkable.
Boer, I don't know what scientific background you have. Thermometers more than 100 years ago were made with mercury. Funny thing: mercury can evaporate. Thermometers break and mercury gets out. So they started using alcohol in thermometers (and most thermometers in the developed world were exchanged for these new ones). Then along came the personal computer and thermistors, which are accurate down to a thousandth of a degree. These, coupled with wireless communication, made it possible to place electronic (not mercury) thermometers all over the place, as long as they had a power line and communications equipment to transmit the readings.
Now, have you ever used a mercury, an alcohol, and a digital thermometer in a hot water bath? Which one are you going to trust to be accurate? Since every grade schooler in the US has done this experiment and realized that the glass tubes holding the liquids can shift up and down against the scale just from picking the thermometer up off a shelf, they all learn to use the digital. Then, when they learn how easy it is to calibrate the digital by running a simple piece of software, and that they can get a reading accurate to a thousandth of a degree versus maybe an inaccurate tenth with a liquid-based thermometer, they never go back.
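For what it's worth, the calibration that software does is basically the same two-point trick the old glass thermometers relied on: fix the scale at the ice point and the boiling point and draw a straight line between them. A minimal sketch, assuming a linear sensor; the raw ADC counts below are made up for illustration:

```python
# Two-point calibration: map raw sensor readings onto the 0-100 C scale
# using the ice point (0 C) and boiling point (100 C) as references.
# The raw counts here are hypothetical, just to show the arithmetic.

def calibrate(raw, raw_ice, raw_boil):
    """Linearly map a raw reading onto the 0-100 C scale."""
    return (raw - raw_ice) / (raw_boil - raw_ice) * 100.0

raw_ice, raw_boil = 512.0, 3895.0   # hypothetical ADC counts at the two fixed points
print(calibrate(2203.5, raw_ice, raw_boil))  # exactly halfway between them -> 50.0
```

The point is that the digital device can be re-zeroed in seconds with software, while a slipped glass tube has to be thrown away.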
Now, besides the inaccuracies of the measuring device, let's consider the number of devices and their locations. 100 years ago, how many locations were attempting to record an accurate temperature? Were they recording the temperature of the ground or of the air? Were they dry, or were they affected by relative humidity? Were they in urban environments only, or were they spread evenly across the surface of the earth (land and sea)? Go draw a bath of water, vary the temperature while you're filling it, and stick one thermometer in after you're done. Is that thermometer going to accurately read the temperature of the entire bath? NO. Just one single point. Will data based on that point represent the trend of the entire bath? NO. Just one point.
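The one-thermometer-in-the-bath point fits in a few lines of code. This is a sketch, not real data: the linear gradient from 30 C to 45 C is invented purely to show how one probe at one end misrepresents the whole bath:

```python
# A bath filled while the tap temperature varied: model it as a simple
# linear gradient from 30 C at one end to 45 C at the other.
# (Values are invented; the point is single-point vs. whole-bath sampling.)
points = [30.0 + 15.0 * x / 99 for x in range(100)]

true_mean = sum(points) / len(points)   # what "the bath temperature" really is
single_probe = points[0]                # one thermometer stuck in at one end

print(f"bath mean: {true_mean:.1f} C, single probe: {single_probe:.1f} C")
# the single probe reads 30.0 C while the bath as a whole averages 37.5 C
```

Scale that error up from one bath to one weather station standing in for thousands of square miles and you see the problem with sparse historical coverage.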
How do you compare data from the past 100 years when you keep adding new points, methods, equipment, etc.? You try to find a data point for which these things have not changed.
So they went to the poles. What did they find? Not a temperature, but a relationship between the thickness of the ice layers and the CO2 trapped within each band of ice.
Well, that's a start. Now we have a good idea that the temperature of the air and the thickness of the ice are related, and we have a strong correlation between thickness and CO2. So we believe we can say that CO2 levels are higher now than they have been in a very long time (again, how long is questionable), and we can begin to say that there might be some relationship between that and industrialization. Except that industrialization began more than 100 years ago, and temperatures only appear to have begun rising in the past 30 years; before that they were actually falling enough to make some people think we were entering another ice age (look at those people now and shake your head shamefully). So now someone needs to provide data on whether industrialization suddenly and drastically increased its CO2 output 30 years ago.
Oh, and don't even try to say that it took a while for the increase in CO2 to kick in. The study of the two-mile-deep core sample at Vostok shows a strict relationship between air temperature and CO2 levels (not a relationship offset by 70 years, but a strict one, meaning every hot year had an abnormally high amount of CO2).
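Whether a record shows a strict (zero-lag) relationship or an offset one is actually something you can test: slide one series against the other and see where the correlation peaks. A toy sketch, with synthetic series rather than Vostok data, and with `best_lag` as a hypothetical helper name:

```python
# Test whether CO2 best tracks temperature at zero lag or at some offset,
# using Pearson correlation on toy synthetic series (NOT real ice-core data).

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def best_lag(temp, co2, max_lag):
    """Return the lag (in samples) at which CO2 correlates best with temperature."""
    scores = {}
    for lag in range(0, max_lag + 1):
        # positive lag: CO2 responds `lag` steps after temperature
        scores[lag] = pearson(temp[:len(temp) - lag], co2[lag:])
    return max(scores, key=scores.get)

# toy series: CO2 is the temperature series shifted by 3 steps plus a baseline
temp = [float((i * 7) % 13) for i in range(50)]
co2 = [280.0] * 3 + [t + 280.0 for t in temp[:-3]]
print(best_lag(temp, co2, 10))  # -> 3, recovering the built-in offset
```

If the ice-core relationship really is strict, this kind of analysis on the actual data should peak at lag zero; if CO2 trails temperature, it peaks at the offset.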
Now take into consideration the effect of the sun on Mars: the movement of the dry ice from one pole to the other, and the rise and fall of CO2 levels in the atmosphere when this happens. In the hottest years more CO2 is released into the atmosphere, so the following year more is frozen into the ice, because there is more in the atmosphere to freeze out. It seems like CO2 in an ice core sample should follow the hot years (of course, on Earth we don't have dry ice at the poles; we have gas bubbles in the ice).