Modern color TVs are quite robust compared to the early models, whose CRTs were round rather than rectangular like modern tubes. In those days, the internal circuitry was built around vacuum tubes (the CRT itself *is* a vacuum tube, of course, but it's only a display device rather than an active part of the TV's circuitry). Tubes run at high temperatures and high voltages, which eventually causes the failure not only of the tubes themselves but of the various components around them (other things can cause them to fail, such as internal degradation, but the heat from those tubes probably doesn't help). Also, if you moved an early color TV from one location to another, the color alignment of the CRT would often be thrown off-kilter, requiring internal adjustments (known as 'purity' and 'convergence', among other things). During the '70s, the advent of solid-state circuitry (and the eventual elimination of vacuum tubes from the internal circuits), improvements in the design and materials of the various components involved, and the maturing of cathode ray tube technology meant that color sets were much more stable and didn't need repairs nearly as often.
As for the OP's question, I agree with what others have said about Sony using start-up self-test circuitry. We have a 27" Sony Trinitron set (circa 2006) in our living room, and when it's first turned on, you hear the high voltage kick on, and a red LED next to the power button flashes for several seconds. Once it's done with its various start-up tests, the LED goes out, and the screen image comes up fairly quickly. If the set somehow fails these start-up checks, the screen will refuse to come up, and the LED will flash a code identifying which part of the circuitry failed its test. Hope this helps.
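For anyone curious, the behavior described above amounts to a power-on self-test loop: run each subsystem check in order, and if one fails, hold off the picture and flash a code identifying the failure. Here's a minimal sketch of that idea — the check names and blink codes below are made up for illustration, since Sony's actual diagnostic codes and test order aren't something I know:

```python
# Hypothetical power-on self-test (POST) sketch. Each entry pairs a
# subsystem check with the blink code the LED would flash on failure.
# These names and codes are invented for illustration only.
CHECKS = [
    ("high_voltage", 2),
    ("vertical_deflection", 3),
    ("horizontal_deflection", 4),
]

def power_on_self_test(results):
    """results maps a check name to True (pass) or False (fail).
    Returns 0 if every check passes, otherwise the blink code of
    the first failing (or missing) check."""
    for name, blink_code in CHECKS:
        if not results.get(name, False):
            return blink_code  # LED flashes this code; picture stays off
    return 0  # all checks passed; LED goes out, picture comes up

# All subsystems pass -> code 0 (no fault)
print(power_on_self_test({"high_voltage": True,
                          "vertical_deflection": True,
                          "horizontal_deflection": True}))  # 0
# Vertical deflection fails -> its blink code
print(power_on_self_test({"high_voltage": True,
                          "vertical_deflection": False}))  # 3
```

A real set would of course run these checks in hardware or firmware against actual sensor readings, but the "first failure determines the blink code" structure is the same idea.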
-Adam