I looked at some previous posts on this topic, but I didn't really see anything useful.
Basically, the subject divides people into two camps:
The people who run just one game on a 100A breaker and continuously hose it down with a fire extinguisher just in case of FIRE!! (Wear a flame suit! FIRE!!)
The people who run 30 games on one frayed extension cord that their Gram-Gram bought in 1957. (They wrapped the part that the dog chewed on with electrical tape.)
The game room I built in my basement has a MAGNIFICENT number of outlets, one every 5 feet or so. Because I hate extension cords.
There are THREE 20A breakers powering the room, about six outlets per breaker.
The one that powers the back wall has FOUR 1990s Bally/Williams pins and two arcade games on it. (Plus some shelf speakers, but those only draw 0.15 A.)
As an experiment, I hooked everything up to one power strip and had the pins run their "hunt for the missing ball" routine. The highest draw I could get was about 10.5 amps for the whole shebang, measured with a Kill A Watt meter.
Seeing as how a 20A circuit is good for 16 amps of continuous draw (the 80% rule), why can't I turn everything on and off at the 20A breaker in the panel?
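For anyone who wants to sanity-check my math, here's a quick back-of-the-envelope in Python. The numbers are my own measurements from above; the 80% figure is the usual NEC continuous-load rule of thumb:

```python
# Load check against the 80% continuous rule. 10.5 A is my measured
# worst case (all games in ball search); the 0.15 A of speakers is
# basically in the noise either way.
BREAKER_AMPS = 20.0
CONTINUOUS_LIMIT = 0.80 * BREAKER_AMPS  # 16.0 A

total = 10.5  # Kill A Watt reading, everything on one strip
headroom = CONTINUOUS_LIMIT - total
print(f"Total draw: {total:.1f} A of {CONTINUOUS_LIMIT:.1f} A allowed "
      f"({total / CONTINUOUS_LIMIT:.0%}), headroom {headroom:.1f} A")
# -> Total draw: 10.5 A of 16.0 A allowed (66%), headroom 5.5 A
```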
A lot of the other threads on this topic say there will be a MASSIVE SPIKE of power draw as the games come on, but I'm not seeing how that could be. I don't see any spike when the games power up; they seem to ramp up their draw gradually as they boot.
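One caveat on my own measurement: the Kill A Watt only updates about once a second, so a millisecond-scale inrush spike would never show on it. But even granting the spike exists, here's a rough sketch of why I doubt it matters. The inrush numbers below are pure guesses on my part, not measurements, and real trip curves vary by breaker model:

```python
# Rough inrush sanity check with ASSUMED numbers: say a 1990s pin with a
# linear supply pulls ~30 A for a few milliseconds while the transformer
# magnetizes and the filter caps charge. A garden-variety thermal-magnetic
# breaker's instantaneous (magnetic) trip is typically somewhere around
# 5-10x its rating; anything under that has to cook the thermal element
# for seconds to minutes before it trips.
GAMES = 6                   # four pins + two arcade cabinets
INRUSH_PER_GAME = 30.0      # amps, a guess -- varies wildly by machine
INRUSH_DURATION_MS = 5.0    # milliseconds, also a guess
MAGNETIC_TRIP = 10 * 20.0   # ~200 A instantaneous trip, assumed curve

worst_case = GAMES * INRUSH_PER_GAME  # all six peaking at the same instant
print(f"Worst-case simultaneous inrush: {worst_case:.0f} A "
      f"for ~{INRUSH_DURATION_MS:.0f} ms")
print("Hits the magnetic trip?", worst_case >= MAGNETIC_TRIP)
# Even near the magnetic threshold, a ~5 ms blip is far too short to
# heat the thermal element, so the breaker should shrug it off.
```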
I tested one of those huge electric room heaters and a hair dryer, and each of them alone pulled more amps than all the pins and games did together.
So if switching on a room heater that draws a steady 13 amps is OK, how come switching on a wad of games that draws 10.5 amps combined isn't?
What did they use to do at old coin-op arcades? Walk around and individually power on 50 games?