Difference between the I2C protocol and the UART protocol

Tan

UART is asynchronous, so it needs the start bit to go low to indicate that data is about to be transferred. But I see the line being pulled low at the start of an I2C transfer as well, even though I2C is synchronous.
Can anyone throw some light on this, please?
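For context, here is a minimal sketch of how that start bit is used on the transmit side of a plain 8N1 UART frame. The helper names (tx_set_level(), delay_one_bit()) are assumptions standing in for platform-specific pin access and baud-rate timing, not any particular vendor API.

/* Minimal bit-banged UART transmit sketch (8N1). The helpers below are
 * hypothetical and would be tied to the sender's own baud-rate timing. */

#include <stdint.h>

void tx_set_level(int level);   /* drive the TX line high (1) or low (0) */
void delay_one_bit(void);       /* wait one bit period, e.g. 1/9600 s    */

void uart_send_byte(uint8_t data)
{
    /* Start bit: the line idles high, so pulling it low tells the
     * receiver "a frame begins now" and gives it an edge to sync on. */
    tx_set_level(0);
    delay_one_bit();

    /* 8 data bits, LSB first, each held for one bit period. */
    for (int i = 0; i < 8; i++) {
        tx_set_level((data >> i) & 1);
        delay_one_bit();
    }

    /* Stop bit: return the line to its idle-high state. */
    tx_set_level(1);
    delay_one_bit();
}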
 

A UART used in null-modem mode does not wait for an acknowledgement. UART is peer-to-peer communication, whereas I2C is a broadcast on a shared bus. In I2C the clock is always generated by the master, and all the slaves depend on the master's clock for communication. In UART, each device has its own clock. We call I2C synchronous because the master generates the clock, and it only does so once the start condition has been issued. UART, on the other hand, depends on each device's own processor-derived baud clock.

Synchronous and asynchronous are not defined by a clock line going low or high, but by the way the clock is generated and used.
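To make the "each device has its own clock" point concrete, here is a sketch of the receive side of that UART frame. The helper names (rx_get_level(), delay_one_bit(), delay_half_bit()) are assumptions standing in for platform-specific pin and timer access; the only timing reference the sender supplies is the falling edge of the start bit, and everything after that is measured with the receiver's local clock.

/* Sketch of the UART receive side, using hypothetical helpers driven by
 * the receiver's *own* clock. */

#include <stdint.h>

int  rx_get_level(void);        /* read the RX line: 1 = high, 0 = low */
void delay_one_bit(void);       /* one bit period from the local clock */
void delay_half_bit(void);      /* half a bit period                   */

uint8_t uart_receive_byte(void)
{
    uint8_t data = 0;

    /* Wait for the start bit: the idle-high line going low. */
    while (rx_get_level() != 0)
        ;

    /* Move to the middle of the first data bit (1.5 bit periods in). */
    delay_one_bit();
    delay_half_bit();

    /* Sample 8 data bits, LSB first, one local bit period apart. */
    for (int i = 0; i < 8; i++) {
        if (rx_get_level())
            data |= (uint8_t)(1u << i);
        delay_one_bit();
    }

    /* The stop bit would be checked here; omitted for brevity. */
    return data;
}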
 
What I meant was: in the UART protocol the start and stop bits are mandatory for framing, because the data is not accompanied by a clock. But since I2C is synchronous, why does it still need a start condition that pulls the line low? Couldn't the master just send the data to the slaves directly?
 

In I2C, all the slaves sit idle when there is no communication. Each slave has a unique address, and while idle they have to be woken up by a common signal, the start condition. It tells every slave to become active and that the master wants to communicate. If it is not sent, the slaves will not respond to the master at all, so talking to a slave directly is not possible until you wake it up from idle mode.
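A minimal bit-banged sketch of that wake-up sequence, assuming hypothetical open-drain helpers (sda_set(), scl_set(), sda_read()) and a delay_quarter_period() timing helper: the master creates the START condition by pulling SDA low while SCL is high, then clocks out the 7-bit address plus the R/W bit and checks the ACK slot.

#include <stdint.h>
#include <stdbool.h>

void sda_set(int level);            /* release (1) or pull low (0) SDA  */
void scl_set(int level);            /* release (1) or pull low (0) SCL  */
int  sda_read(void);                /* read SDA while it is released    */
void delay_quarter_period(void);    /* a fraction of the SCL period     */

/* START: SDA falls while SCL is high. This is the "wake up" signal the
 * idle slaves watch for before they start shifting in the address. */
void i2c_start(void)
{
    sda_set(1);
    scl_set(1);
    delay_quarter_period();
    sda_set(0);                     /* SDA low while SCL high = START */
    delay_quarter_period();
    scl_set(0);
}

/* Clock out one byte (MSB first) and return true if a slave ACKed.
 * For the first byte this is the 7-bit address plus the R/W bit,
 * e.g. i2c_write_byte(0x50 << 1) to address slave 0x50 for a write. */
bool i2c_write_byte(uint8_t byte)
{
    for (int i = 7; i >= 0; i--) {
        sda_set((byte >> i) & 1);   /* data changes only while SCL low  */
        delay_quarter_period();
        scl_set(1);                 /* master generates every SCL pulse */
        delay_quarter_period();
        scl_set(0);
    }

    /* ACK slot: release SDA and let the addressed slave pull it low. */
    sda_set(1);
    delay_quarter_period();
    scl_set(1);
    bool acked = (sda_read() == 0);
    delay_quarter_period();
    scl_set(0);
    return acked;
}

Only the slave whose address matches pulls SDA low in the ACK slot; all the others simply go back to waiting for the next START.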
 
Hi Tan,

I2C is a Philips (now NXP) standard; it was defined this way, and if you want to claim your design is I2C compliant, you must follow the I2C protocol, that's it. If you think the standard is not efficient, you could propose a new one.
I2C is very well designed to reduce the number of pins for a multi-master, multi-slave protocol, with a small gate count.

And generally, a design oversamples the I2C clock as well as the data, to eliminate one clock domain; clocking the logic directly from SCL adds more problems than it solves.
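As a rough software model of that oversampling idea (the real thing would live in RTL), the slave-side logic samples SCL with its own faster system clock and looks for edges, instead of clocking flip-flops from SCL directly. The names sample_scl() and on_scl_rising() are illustrative assumptions.

/* Called once per fast system-clock tick. SCL is treated as just
 * another input signal sampled in the system clock domain, so there is
 * no separate SCL clock domain to synchronize against. */
void scl_edge_detector_tick(void)
{
    static int previous = 1;    /* SCL idles high */

    int current = sample_scl();

    /* A low-to-high transition seen between two consecutive samples is
     * treated as one SCL rising edge (shift in the next SDA bit, etc.). */
    if (previous == 0 && current == 1)
        on_scl_rising();

    previous = current;
}

int  sample_scl(void);          /* read SCL at the fast system tick */
void on_scl_rising(void);       /* handle one received bit          */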
 

In fact, we use I2C for many purposes, such as ATE pattern input!
 
