That's a good question, and it shows that you are trying to think analytically, which is very good. We don't usually speak of "slowing down" a signal. The one exception is that in coaxial cables (and other electromagnetic "transmission lines") the propagation speed is somewhat slower than the speed of light, which probably just confuses things in a question about cables, unless that effect is what you meant to ask about.
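As a side note on that propagation-speed effect: the speed depends mostly on the dielectric between the conductors. A minimal sketch, assuming a solid-polyethylene dielectric with a relative permittivity of about 2.25 (a typical textbook value, not a datasheet number):

```python
import math

# Sketch: propagation speed in a coax is set mainly by the dielectric.
# Assumed dielectric: solid polyethylene, relative permittivity ~2.25.
c = 299_792_458.0   # speed of light in vacuum, m/s
eps_r = 2.25        # relative permittivity (assumption)

velocity_factor = 1.0 / math.sqrt(eps_r)  # fraction of c
v = c * velocity_factor                   # propagation speed in the cable, m/s

print(f"velocity factor ~ {velocity_factor:.2f}")  # ~0.67
print(f"signal speed    ~ {v/1e8:.2f}e8 m/s")
```

So the signal still travels at roughly two-thirds of the speed of light; "slow" only in a relative sense.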
But I think that you probably meant that you were wondering if the capacitance might lower the amplitude, depending on the frequency, like a low-pass filter. Is that correct?
Typically, the capacitance between the shield and the center conductor of a coaxial cable is extremely low at any given point. Frequencies high enough to be affected by it don't "see" the combined total capacitance of the whole cable's center conductor and shield; they are only affected by the impedance in their immediate vicinity, and that impedance is more complicated than just the capacitance between the shield and the center conductor.
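To put a number on "extremely low": the per-unit-length capacitance of an ideal coax is C' = 2*pi*eps0*eps_r / ln(b/a). A rough sketch with RG-58-like dimensions, chosen for illustration rather than taken from a datasheet:

```python
import math

# Per-unit-length capacitance of a coaxial line:
#   C' = 2*pi*eps0*eps_r / ln(b/a)
# All dimensions below are rough, RG-58-like assumptions.
eps0 = 8.854e-12   # F/m, vacuum permittivity
eps_r = 2.25       # solid polyethylene (assumption)
a = 0.45e-3        # inner-conductor radius, m (assumption)
b = 1.47e-3        # inner radius of the shield, m (assumption)

C_per_m = 2 * math.pi * eps0 * eps_r / math.log(b / a)
print(f"C' ~ {C_per_m * 1e12:.0f} pF/m")  # on the order of 100 pF/m
```

So a short section of cable contributes only tens of picofarads, and a high-frequency signal interacts with that distributed capacitance locally, not with the whole cable's capacitance at once.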
If you are talking about microwave-frequency signals (and, to a lesser extent, lower frequencies), then yes, there could be some significant attenuation involving capacitance (especially combined with the resistance and inductance that inevitably accompany it), even if the capacitance is very, very small, and the attenuation could become more significant as the frequency increases. But I don't think it's only (or even mainly) due to the capacitance between the center conductor and the shield. To understand what is happening at those high frequencies you would have to use a complete electromagnetic model (Maxwell's Equations), treat the resistance, capacitance, and inductance as "distributed" along the line instead of as "lumped components" like in a schematic, and also take into account the dielectric characteristics of the material between the two conductors, the conductors themselves, what is outside the shield, the cable geometry, etc.
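The standard "distributed" picture is the telegrapher's-equation model: per-metre series resistance and inductance plus per-metre shunt conductance and capacitance give the line's characteristic impedance and its attenuation. A minimal sketch, with rough RG-58-like per-metre values at 100 MHz that are assumptions for illustration, not measurements:

```python
import cmath
import math

# Telegrapher's-equation sketch: per-metre R, L, G, C -> propagation
# constant gamma and characteristic impedance Z0.
# All four line parameters below are rough assumptions, not measured data.
f = 100e6            # Hz, example frequency
w = 2 * math.pi * f
R = 1.0              # ohm/m, conductor loss incl. skin effect (assumption)
L = 250e-9           # H/m, series inductance (assumption)
G = 1e-5             # S/m, dielectric loss (assumption)
C = 100e-12          # F/m, shunt capacitance (assumption)

Zs = R + 1j * w * L           # series impedance per metre
Yp = G + 1j * w * C           # shunt admittance per metre
gamma = cmath.sqrt(Zs * Yp)   # propagation constant (alpha + j*beta)
Z0 = cmath.sqrt(Zs / Yp)      # characteristic impedance

alpha_db_per_m = 8.686 * gamma.real   # convert Np/m to dB/m
print(f"Z0   ~ {abs(Z0):.0f} ohms")
print(f"loss ~ {alpha_db_per_m:.2f} dB/m at {f/1e6:.0f} MHz")
```

Note that the capacitance enters only together with R, L, and G; with these assumed values you get the familiar ~50 ohm impedance, and the loss comes mostly from the series resistance, not the capacitance alone.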
Above a gigahertz or two, the losses a signal suffers in even very high-quality coaxial cable can become quite high, and waveguides are usually used instead of wires.
I am sorry, but I cannot understand your second question. How do you measure wire "width" in tracks, and why?