I still stand by my comment that the ESCs you tried never got properly calibrated, so they never would arm. Radios, even of the same make and model, will vary slightly in the timings needed to perform a calibration. Since we can't rely on perfect signal timing... we start by setting the transmitter at its own 0% and 100% positions and attempt the calibration from there. If that doesn't work, you adjust the 0% side up or down until the ESC does see the signals it expects.
Since different manuals may use different wording for "calibrate"... I'll also describe it as setting the high and low end points of the signal's range... i.e. the 0% signal and the 100% signal. Add to that the fact that each ESC can use a slightly different method for calibrating, and the situation gets more complex. Then add the inaccuracies of the technology itself (which is what calibration compensates for), and you get this situation... where people think something isn't working when it is, and it just looks like a bad part to them.
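To make that concrete, here's a rough Python sketch of what "setting the end points" amounts to. This is not real ESC firmware, just an illustration with made-up numbers: during calibration the ESC records the pulse widths your particular radio actually sends at 100% and at 0% throttle, and uses those as its range from then on.

```python
# Hypothetical illustration of ESC endpoint calibration -- not real firmware.
# Pulse widths are in microseconds (us).

def calibrate(pulses_at_full, pulses_at_zero):
    """Learn the radio's actual 100% and 0% pulse widths.

    During calibration the stick is held at full, then at zero, and the ESC
    averages the pulse widths it sees at each position to smooth out jitter.
    Returns (low_endpoint_us, high_endpoint_us).
    """
    high = sum(pulses_at_full) / len(pulses_at_full)
    low = sum(pulses_at_zero) / len(pulses_at_zero)
    return low, high

# Example: this particular radio really sends ~1915 us at 100% and ~1080 us
# at 0%, instead of the textbook 2000/1000 us.
low, high = calibrate([1914, 1916, 1915], [1081, 1079, 1080])
print(low, high)   # roughly 1080.0 and 1915.0 -- these become the ESC's endpoints
```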
Warning: I'm about to get fairly technical... and still not technical enough to explain this completely accurately... but it will cover the basics. I may also make a poor choice of words (like saying length where width is the proper term, as an example).
Your friend's multimeter is not going to be good for testing the microsecond-scale signals on the signal line going to the servo/ESC. Most meters (even expensive ones) just measure the average voltage on the line, not the true RMS (root mean square) value that a PWM (pulse width modulation) signal produces. Of the multimeters I have seen that do measure true RMS, the fastest sample at around 6000 readings per second... the signals we are talking about are measured in µs, aka microseconds, which are millionths of a second. So to properly measure the signal you would need an oscilloscope with a sampling rate high enough to capture it. These pulses of info are repeated every 11 ms or 20 ms... ms is milliseconds, or thousandths of a second.
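As a back-of-the-envelope example of why an averaging meter can't resolve this (my own made-up numbers, assuming a 3.3 V signal line and a 20 ms frame):

```python
# Rough numbers, assuming a 3.3 V signal and a 20 ms (50 Hz) frame.
# A meter that averages the line only sees the duty cycle, not the timing.

V_HIGH = 3.3          # volts while the pulse is high
FRAME_MS = 20.0       # one pulse every 20 ms

for pulse_ms in (1.0, 1.5, 2.0):
    duty = pulse_ms / FRAME_MS
    avg_v = V_HIGH * duty
    print(f"{pulse_ms:.1f} ms pulse -> average ~{avg_v:.3f} V")

# 1.0 ms -> ~0.165 V, 1.5 ms -> ~0.248 V, 2.0 ms -> ~0.330 V.
# The whole usable range is squeezed into a couple tenths of a volt, and a
# 10-20 us calibration error only shifts the reading by a few millivolts --
# which is why you need a scope, not a meter, to actually see the pulses.
```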
So you get a signal pulse whose width, measured in µs, tells the servo where (what angle) to place its arm... and this is repeated every 11 or 20 ms. A servo isn't actually digital in the normal sense of the word; it's analog. It reads the width of each individual pulse to know what angle to place the servo arm, and the pulses repeat at a frame rate of roughly 40 Hz to 200 Hz. Said another way: at a pulse width of 0.5 ms the servo arm is at its far left position, a pulse of 1.5 ms puts it in the middle, and 2.5 ms puts it at its far right. An ESC is basically the digital version of a servo, which then translates that signal into the output it sends to the motor.
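Put into a quick sketch (again just an illustration, using the 0.5/1.5/2.5 ms figures above and assuming a ±90 degree arm travel):

```python
# Map a pulse width to a servo arm angle, assuming 0.5 ms = far left,
# 1.5 ms = center, 2.5 ms = far right, with +/-90 degrees of travel.

def pulse_to_angle(pulse_ms, min_ms=0.5, max_ms=2.5, travel_deg=180.0):
    """Linear map from pulse width to arm angle, centered at 0 degrees."""
    pulse_ms = max(min_ms, min(max_ms, pulse_ms))       # clamp out-of-range pulses
    fraction = (pulse_ms - min_ms) / (max_ms - min_ms)  # 0.0 .. 1.0 across the range
    return fraction * travel_deg - travel_deg / 2.0     # -90 .. +90 degrees

print(pulse_to_angle(0.5))   # -90.0  (far left)
print(pulse_to_angle(1.5))   #   0.0  (middle)
print(pulse_to_angle(2.5))   # +90.0  (far right)
```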
It is this fine generation and measurement of microsecond pulses that leads to this situation. Nothing is perfectly accurate, so the pulse widths vary slightly from radio to radio, even between the same make and model. The servos cope because they are reading at an analog level... the ESCs, on the other hand, read the pulses digitally and then translate them into an analog PWM signal the motor understands. Since the pulses can't be generated or read perfectly accurately, we use the calibration step to tell the ESC where those endpoint values really are.
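Here's the same idea in numbers (made-up but typical values), comparing an ESC that assumes textbook 1000/2000 µs endpoints against one that was calibrated to what the radio really sends, using endpoints captured the way the earlier sketch does:

```python
# Made-up but typical numbers showing why the calibration step matters.
# This radio actually outputs 1080 us at 0% throttle and 1915 us at 100%.

def pct(pulse_us, low, high):
    """Convert a pulse width to a 0-100% throttle value for given endpoints."""
    p = (pulse_us - low) / (high - low) * 100.0
    return max(0.0, min(100.0, p))

radio_low, radio_high = 1080, 1915       # what this radio really sends

# Uncalibrated ESC assuming the "ideal" 1000-2000 us range:
print(pct(radio_low, 1000, 2000))        # 8.0%  -> never sees "zero", may refuse to arm
print(pct(radio_high, 1000, 2000))       # 91.5% -> never reaches full throttle

# Same ESC after calibrating to the radio's real endpoints:
print(pct(radio_low, radio_low, radio_high))    # 0.0%   -> arms normally
print(pct(radio_high, radio_low, radio_high))   # 100.0% -> full range available
```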
Machinists face the same situation when measuring the size of things as exactly as they can. They have to purchase a set of exact-size reference pieces to use as their standard, so every measurement they make is calibrated from those standards... but as machinists know, they are only as accurate as the set of standards they purchased and the tools they use to measure with.