Hello,
I want to know about serial transfer of data, for example between two computers at a distance of three metres. When one byte is transferred, is it true that one bit is transferred through the line until it reaches the other computer, and only then is the next bit transmitted?
I mean, when the first bit goes through the 3 m line and the receiving computer has received it, only then is the next bit sent?
In synchronous transfers it is possible under certain circumstances to send a single bit, because the rate at which the bits are sent can be controlled by the receiving end. In other words, if there were several bits, it could receive one and then pause before the rest, because it has control over the clock that times the transfer. It is unusual to do this though.
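Here is a rough sketch of what that looks like when the receiver drives the clock, bit-banged in C. The gpio_write(), gpio_read() and delay_us() helpers are assumptions standing in for whatever your microcontroller provides, not a real API:

```c
/* Sketch of a receiver-clocked synchronous read. The receiver toggles
 * the clock line itself, so it can pause between bits whenever it likes.
 * gpio_write()/gpio_read()/delay_us() are assumed, chip-specific helpers. */
#include <stdint.h>

#define CLK_PIN  1   /* clock line, driven by the receiver here */
#define DATA_PIN 2   /* data line, driven by the transmitter    */

extern void gpio_write(int pin, int level);   /* assumed helper */
extern int  gpio_read(int pin);               /* assumed helper */
extern void delay_us(uint32_t us);            /* assumed helper */

uint8_t sync_receive_byte(void)
{
    uint8_t value = 0;

    for (int bit = 0; bit < 8; bit++) {
        gpio_write(CLK_PIN, 1);          /* ask the transmitter for the next bit */
        delay_us(5);                     /* let the data line settle */
        value |= (uint8_t)(gpio_read(DATA_PIN) << bit);
        gpio_write(CLK_PIN, 0);

        /* Because the receiver owns the clock, it could stop here for
         * as long as it likes before clocking out the remaining bits. */
        delay_us(5);
    }
    return value;
}
```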
Asynchronous transfers are usually done in blocks of 8 bits (one byte), although the actual number of bits can be anything you like. They are preceded by a start bit and end with a stop bit, so there are generally two more bits per transfer than there are data bits. The bits are sent at a predetermined rate (the baud rate), so when the receiving end recognizes the start bit, it knows the following bits should be sampled at fixed intervals.
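To make the framing concrete, here is a minimal software (bit-banged) transmit sketch: one start bit, eight data bits LSB first, one stop bit, each held for one bit period of 1/baud. It uses the same assumed gpio_write()/delay_us() helpers as above; a real UART peripheral does all of this in hardware:

```c
/* Sketch of an async frame at 9600 baud: start bit, 8 data bits LSB
 * first, stop bit, each lasting one bit period (~104 us at 9600 baud). */
#include <stdint.h>

#define TX_PIN        3
#define BAUD          9600u
#define BIT_PERIOD_US (1000000u / BAUD)

extern void gpio_write(int pin, int level);   /* assumed helper */
extern void delay_us(uint32_t us);            /* assumed helper */

void async_send_byte(uint8_t value)
{
    gpio_write(TX_PIN, 0);               /* start bit: idle-high line goes low */
    delay_us(BIT_PERIOD_US);

    for (int bit = 0; bit < 8; bit++) {  /* 8 data bits, LSB first */
        gpio_write(TX_PIN, (value >> bit) & 1);
        delay_us(BIT_PERIOD_US);
    }

    gpio_write(TX_PIN, 1);               /* stop bit: line returns high */
    delay_us(BIT_PERIOD_US);
}
```

So a single byte on the wire is actually 10 bit periods, which at 9600 baud is a little over a millisecond per byte.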
Both methods can be used over long distances, but synchronous transfers need an extra wire to carry the clock signal, so it is more common to use async instead.
Most UART devices, whether single components or built into a microcontroller, can be used in both modes and have circuits in them to detect the start and stop bits and generate the necessary timing signals.
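The receive side of that logic looks roughly like the sketch below: wait for the falling edge of the start bit, then sample each data bit in the middle of its bit period. Again this is only an illustration with the same assumed helpers; a real UART does it in hardware, usually with an oversampling clock around 16x the baud rate:

```c
/* Sketch of UART-style receive timing: detect the start bit's falling
 * edge, then sample each data bit at the centre of its bit period. */
#include <stdint.h>

#define RX_PIN        4
#define BAUD          9600u
#define BIT_PERIOD_US (1000000u / BAUD)

extern int  gpio_read(int pin);               /* assumed helper */
extern void delay_us(uint32_t us);            /* assumed helper */

uint8_t async_receive_byte(void)
{
    uint8_t value = 0;

    while (gpio_read(RX_PIN) != 0)            /* idle line is high; wait for start bit */
        ;

    /* Skip the rest of the start bit and land in the middle of data bit 0. */
    delay_us(BIT_PERIOD_US + BIT_PERIOD_US / 2);

    for (int bit = 0; bit < 8; bit++) {
        value |= (uint8_t)(gpio_read(RX_PIN) << bit);
        delay_us(BIT_PERIOD_US);
    }

    /* The stop bit (line high) follows; a real UART checks it and
     * flags a framing error if the line is still low here. */
    return value;
}
```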