As for the whole amp-hour thing, an electric motor is an electric motor, and the basic power relationship (watts = volts × amps) applies universally. Let's throw some arbitrary numbers out there just for the purpose of discussion.
Say you have a 1 hp motor that draws 10 amps at 12 V, and you run it on a 100 amp-hour battery. It will last 10 hours (100 Ah ÷ 10 A).
Now let's take a 1 hp motor on 24 V. All other things being equal, it will draw 5 amps (double the voltage and the amp draw cuts in half). We need two 12 V batteries in series to make 24 V, right? Run that 1 hp motor on two 12 V, 100 amp-hour batteries wired in series for 24 V, and yes, you get twice the run time on a charge (20 hours).
Put two 100 amp-hour batteries on that 12 V motor wired in parallel and there is literally no difference: 200 amp-hours with a 10 amp load still gets you 20 hours of run time.
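The arithmetic above boils down to one rule: run time equals battery capacity divided by current draw. Here's a quick sketch using the same made-up numbers from the example (10 A at 12 V, 100 Ah batteries):

```python
def run_time_hours(capacity_ah, load_amps):
    """Ideal run time: battery capacity divided by current draw."""
    return capacity_ah / load_amps

# Single 12 V, 100 Ah battery with a 10 A load:
print(run_time_hours(100, 10))   # 10 hours

# Two 100 Ah batteries in series (24 V): capacity stays 100 Ah,
# but the same motor draws half the current (5 A):
print(run_time_hours(100, 5))    # 20 hours

# Two 100 Ah batteries in parallel (12 V): capacity doubles to
# 200 Ah, and the current draw stays 10 A -- same result:
print(run_time_hours(200, 10))   # 20 hours
```

Series and parallel come out identical because the total stored energy is the same either way; only the voltage/current split changes.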
Now double the size of that motor to 2 hp: at 12 V the amp draw would be 20 amps, and at 24 V it would be 10 amps. The same rule applies when calculating amp-hours vs. run time.
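The doubling works out from the power relationship: current is power divided by voltage. Note the example's "1 hp drawing 10 A at 12 V" implies 120 W, which is illustrative only (a true horsepower is about 746 W); the sketch below just follows the post's numbers:

```python
def current_amps(power_watts, volts):
    """Current draw from the power relationship: I = P / V."""
    return power_watts / volts

# Treat the example's "1 hp at 12 V drawing 10 A" as 120 W
# (illustrative figure from the post, not the true 746 W/hp):
one_hp_w = 120
two_hp_w = 2 * one_hp_w

print(current_amps(two_hp_w, 12))   # 20 A at 12 V
print(current_amps(two_hp_w, 24))   # 10 A at 24 V
# Run time on 100 Ah: 100/20 = 5 h at 12 V, 100/10 = 10 h at 24 V
```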
The reason manufacturers step up the voltage is to keep amp draw down, which allows smaller wire and smaller circuit protection. It simply keeps things in a more workable window for the end user.
All other things being equal, there is no performance advantage of a 24v 1hp motor over a 12v 1hp motor.
Now let's look at real-world application. If you're running a 55 lb thrust trolling motor on high speed all the time, you're pulling its max rated current. Upgrade to an 80 lb motor, which is roughly 1.5x the thrust, and you can most often get away with running it at less than full speed. That's where the extended run time comes from. Of course, that's with all other things being equal.
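To put rough numbers on that last point: an oversized motor throttled back can draw noticeably less current than a smaller motor at full tilt. The thrust and current figures below are entirely hypothetical, made up for illustration; real trolling-motor draw curves vary by model:

```python
def run_time_hours(capacity_ah, load_amps):
    """Ideal run time: battery capacity divided by current draw."""
    return capacity_ah / load_amps

# 55 lb motor at full throttle (hypothetical 50 A max draw):
print(run_time_hours(100, 50))   # 2 hours

# 80 lb motor throttled back to roughly 55 lb of thrust,
# drawing a hypothetical 30 A at partial throttle:
print(run_time_hours(100, 30))   # about 3.3 hours
```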