You are saying either the instruments are wrong or the effect is real.
I'm saying that the negligible resistance change of a resistor can't be compared to the resistance change in a copper cable when power is dissipated in it, and that a significant resistance change should be observed in a copper cable that's heated up enough that you can feel it getting warm.
If resistance changes of up to 22% in a copper cable carrying current (i.e. with power being dissipated in it) aren't observed under these circumstances then, in my opinion, the resistance tester isn't doing its job very well.
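For scale: copper's temperature coefficient of resistance is roughly 0.393%/°C near room temperature, so a 22% rise corresponds to a temperature rise of about 56°C above ambient. A minimal sketch (the tempco is a standard handbook figure; the 10 mΩ cable is an invented example, not from this thread):

```python
# Copper's temperature coefficient of resistance (per deg C, near 20 C).
ALPHA_CU = 0.00393

def resistance_at(r20, temp_c):
    """Resistance of a copper conductor at temp_c, given its value at 20 C."""
    return r20 * (1 + ALPHA_CU * (temp_c - 20))

# Temperature rise needed for a 22% resistance increase:
delta_t = 0.22 / ALPHA_CU
print(f"{delta_t:.0f} C rise gives +22%")  # about 56 C above ambient

# Example: a hypothetical 10 milliohm cable run warming from 20 C to 76 C:
r_hot = resistance_at(0.010, 76)
print(f"{r_hot * 1000:.2f} mOhm")  # about 12.2 mOhm
```

A cable running 56°C above ambient is well past "you can feel it getting warm", which is the point: if the cable is noticeably warm, the resistance change should be well within a competent tester's resolution.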
Temperature coefficients of resistance are real, and there's a higher likelihood of inaccuracies at lower test currents. In other words, instruments have errors (this is documented) and they can be "wrong"; the question is whether that error is acceptable or not.
Rick said things failed at 1A, i.e. the measured resistance is going up as the test current (and hence the temperature) goes down. Generally copper doesn't do this.
What I've said, consistently, is that, based on practical considerations regarding measuring-instrument design, at lower currents the accuracy and resolution of poorly designed resistance testers are likely to be poorer and thus contribute significant errors, including causing a "fail" at low test currents where the same tester indicates a "pass" at higher currents. This is precisely what I was getting at, in fact.
My tests on resistors showed that the systems were consistent at 1A and 25A; come to that, my Fluke agreed, and that doesn't provide much in the way of wetting current.
But you've said yourself that current needs to be passed through the resistance under test; a Fluke doesn't do this, so your argument is invalid. Try measuring resistance in a copper cable while passing a range of currents through it and you'll see what sources of error there are in practice (one being the inability of a resistance tester to see the small changes in resistance due to temperature, for example).
I've already established that I think there'll be no significant change in the resistance of a resistor, irrespective of the current passing through it, i.e. the (non-destructive) power dissipated in it.
Resistors don't vary significantly in resistance when they're heated - that's how they're designed, and that's the point of my previous posting (you obviously didn't read it or understand what I was getting at). Copper cables, on the other hand, will change their resistance over time as current flows.
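To put numbers on that contrast: a typical metal-film resistor is specified at around ±50 ppm/°C, while copper is roughly 3930 ppm/°C, i.e. nearly two orders of magnitude worse. A rough comparison (the tempco figures are typical datasheet/handbook values, not measurements from this thread):

```python
# Typical temperature coefficients (fractional resistance change per deg C).
TEMPCO_METAL_FILM = 50e-6   # +/-50 ppm/C, a common metal-film resistor spec
TEMPCO_COPPER = 3930e-6     # ~0.393%/C for annealed copper

def pct_change(tempco, delta_t):
    """Percentage resistance change for a given temperature rise."""
    return tempco * delta_t * 100

rise = 50  # deg C of self-heating, chosen for illustration
print(f"Metal-film resistor: {pct_change(TEMPCO_METAL_FILM, rise):.2f}%")  # 0.25%
print(f"Copper cable: about {pct_change(TEMPCO_COPPER, rise):.0f}%")       # nearly 20%
```

So for the same 50°C rise, the resistor moves by a quarter of a percent while the copper cable moves by nearly 20%, which is why the two cases can't be compared.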
Bottom line is that we need to understand the measuring devices we're using and their sources of error before we decide whether the resistance values we're required to record, particularly at lower currents these days, are accurate and repeatable.
When measuring low ohms there are several sources of error:
- Thermally generated voltages at connections of dissimilar metals, within the DUT and at the test terminals (worse at higher temperatures).
- Losses in two-wire systems due to voltage drops in the leads as test current flows.
- Poor voltage-measurement accuracy and/or resolution (especially at lower currents).
- Poor current-source stability (over time / with temperature).
- Poor contact resistance at the test terminals.
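To illustrate the resolution point: the sense voltage across the DUT scales with test current, so at 1A a few milliohms develops only a few millivolts, and the voltmeter's resolution starts to matter. A hypothetical two-wire model (all component values and the 0.1 mV resolution are invented for the example; a four-wire/Kelvin connection would remove the lead term):

```python
def two_wire_reading(r_dut, r_leads, i_test, v_resolution):
    """Model a two-wire low-ohms reading: lead/contact resistance adds in
    series, and the sensed voltage is quantised to the meter's resolution."""
    v_true = (r_dut + r_leads) * i_test
    v_measured = round(v_true / v_resolution) * v_resolution
    return v_measured / i_test

R_DUT = 0.00234   # 2.34 mOhm device under test (hypothetical)
R_LEADS = 0.0     # assume a four-wire hookup, so no lead contribution
V_RES = 0.0001    # 0.1 mV voltmeter resolution (hypothetical)

for i_test in (25.0, 1.0):
    r = two_wire_reading(R_DUT, R_LEADS, i_test, V_RES)
    err = (r - R_DUT) / R_DUT * 100
    print(f"{i_test:>4.0f} A: reads {r * 1000:.3f} mOhm ({err:+.1f}% error)")
```

At 25A the sense voltage is large enough that the quantisation is invisible, but at 1A the same meter resolution introduces a visible error on the same DUT, which is exactly how a part can "pass" at high current and "fail" at low current without the copper itself misbehaving.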