Reduction factor in trackside signalling cables

CABLESCOM, TOGETHER WITH THE ICMA (INSTITUTE OF MATERIALS SCIENCE OF ARAGON) AND THE UNIVERSITY OF ZARAGOZA, HAS DEVELOPED AN INSTRUMENT TO MEASURE THE PERMEABILITY OF STEELS AND TO OPTIMIZE ITS REDUCTION-FACTOR CABLE DESIGNS.

The reduction factor is one of the most important design parameters in today's trackside signalling cables, as it indicates the degree of protection the cable offers against the electromagnetic interference caused by the catenary in high-speed lines.

With the development of high-speed lines and their alternating-current power supply through the catenary, ADIF in Spain found that the current circulating through the 25 kV catenary at 50 Hz caused interference in the signals carried by the trackside signalling cables. This gave rise to the need to protect these cables with screening designed to reduce the interference. The screening developed is a combination of copper and high-magnetic-permeability steel shields. ADIF currently defines two levels of reduction factor: 0.3 and 0.1. This means that the cable screens block most of the interference produced by the catenary, allowing only 30% (0.3) or 10% (0.1) of the interfering signal to pass through. Other countries face the same problem, although the interfering currents may arise at different voltages and frequencies, so other reduction-factor levels may be required.
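As a purely illustrative calculation (the figures below are assumptions, not ADIF data), the reduction factor can be read as the ratio between the interference voltage induced on a screened cable and the voltage that would be induced on the same cable without its screen:

```python
# Illustrative only: the EMF value is assumed, not taken from ADIF specifications.

def residual_interference(induced_emf_volts: float, reduction_factor: float) -> float:
    """Longitudinal EMF remaining on the cores of a screened cable.

    The reduction factor is the ratio (EMF with screen) / (EMF without screen),
    so a 0.3 cable lets 30% of the disturbance through and a 0.1 cable 10%.
    """
    return induced_emf_volts * reduction_factor

# Example: assume the 25 kV, 50 Hz catenary induces 60 V of longitudinal EMF
# on an unscreened cable over a given length of parallel exposure.
unscreened_emf = 60.0
for rf in (0.3, 0.1):
    print(f"Reduction factor {rf}: {residual_interference(unscreened_emf, rf):.1f} V remain")
```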

Both screening materials (copper and high-permeability steel) are expensive and have a significant impact on the final cost of the cable. For this reason, Cablescom, together with the ICMA (Institute of Materials Science of Aragon) and the University of Zaragoza, developed an instrument to measure the permeability of steels and to optimize its reduction-factor cable designs.
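A minimal sketch of why the steel's permeability matters to the design: in the simplified textbook model of a screen bonded and earthed at both ends, the reduction factor is roughly the ratio of the screen's resistance to its total longitudinal impedance, and the inductive term grows with the permeability of the steel layer. The formula and all parameter values below are a generic approximation for illustration, not Cablescom's design method or measured data.

```python
# Simplified textbook model of a screen earthed at both ends:
#   r ~= | R_s / (R_s + j*omega*L_s) |
# where R_s is the screen resistance per km and L_s its inductance per km.
# All parameter values are assumed for illustration only.
import math

def reduction_factor(r_screen_ohm_per_km: float,
                     l_screen_henry_per_km: float,
                     frequency_hz: float = 50.0) -> float:
    """Magnitude of the screening (reduction) factor for a bonded, earthed screen."""
    omega = 2 * math.pi * frequency_hz
    return r_screen_ohm_per_km / math.hypot(r_screen_ohm_per_km,
                                            omega * l_screen_henry_per_km)

# Raising the steel's permeability raises the screen inductance, which lowers
# the reduction factor (better protection) for the same amount of copper.
for l_s in (5e-3, 15e-3, 50e-3):   # H/km, assumed values
    print(f"L_s = {l_s * 1e3:.0f} mH/km -> r = {reduction_factor(0.5, l_s):.2f}")
```

With the assumed screen resistance of 0.5 ohm/km, the three inductance values give factors of roughly 0.30, 0.11 and 0.03, which is why an accurate measurement of the steel's permeability lets the copper and steel content be trimmed to just meet the required level.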

Optimizing the design of reduction-factor cables has become increasingly important of late, as raw-material prices are reaching record levels.