Science Forums

Possible causes for IR data lag...?



Just wondering...

 

Laptops have shipped with infrared ports for ages, but in the last few years fewer and fewer models seem to include one.

 

And mostly because it's slow. Ghastly so.

 

Why?

 

I guess one (or a few) of the following must be the reason:

 

  1. The established IR protocols are ancient and cannot handle high-speed data transfer.
  2. Interference plays a major role. Placing a hot cup of coffee within visible range of two IR ports plays havoc with data integrity and causes packets to be continually resent, slowing things down badly.
  3. The emitting diodes simply don't switch on and off quickly enough for high-speed transfer.
  4. It's awkward setting up line of sight between two models from different manufacturers that insist on placing the emitters/receivers in different places. One might put it on the left, the other might put it on the back.

But that makes me think:

 

  1. Surely it can't be a protocol issue, because you can adapt any protocol to run over any device that simply switches on/off?
  2. Interference can't be the main cause, because fine-tuning the frequency used away from common sources (including appliance remote controls, etc.) can't be impossible? Also, employ a polarised filter over the receiving lens, and you'll only receive the signal intended to go through the filter. Scattered IR from random sources like coffee mugs will be filtered out.
  3. Surely it can't be that materials science can't come up with a high-speed IR diode? Optical fibre is one of the fastest transmission media, and it depends solely on LEDs switching very quickly between on and off. So the diodes exist - at visible frequencies, granted - but can it be that switching on/off at IR frequencies can't be sped up? I guess - because at the heart of infrared transmission lies temperature, the diode must warm up and cool down, in effect... so that's a distinct possibility.
  4. Line-of-sight comfort issues might be a problem, but that only explains why they might remove IR ports from the market, not why it's so terribly slow.

So I guess, if anything, the IR diode's response rate is the biggest problem. What say you?
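Point 1 above - that any protocol can ride on anything that switches between two states - can be sketched in code. The framing below is a hypothetical illustration of simple on-off keying, not any real IR standard: each byte becomes eight LED on/off states, one per bit period.

```python
# On-off keying (OOK) sketch: any byte stream can be framed as a flat
# sequence of LED on/off states, one state per bit period.
# Illustrative only - real IrDA framing uses pulsed encodings instead.

def bytes_to_ook_states(payload: bytes) -> list[int]:
    """Expand each byte (MSB first) into a flat list of 1/0 LED states."""
    states = []
    for byte in payload:
        for bit in range(7, -1, -1):
            states.append((byte >> bit) & 1)
    return states

def ook_states_to_bytes(states: list[int]) -> bytes:
    """Inverse: regroup 1/0 states back into bytes."""
    out = bytearray()
    for i in range(0, len(states), 8):
        byte = 0
        for s in states[i:i + 8]:
            byte = (byte << 1) | s
        out.append(byte)
    return bytes(out)

# Round trip: the "protocol" survives the binary channel unchanged.
msg = b"IR"
assert ook_states_to_bytes(bytes_to_ook_states(msg)) == msg
```

Nothing here depends on the physical medium - which is exactly the OP's point that the protocol itself can't be what makes IR slow.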


Why use IR when you've got Bluetooth, with much higher data rates, no line of sight problems and protocols that will support real-time transmission?

 

Though the motor-driven vehicle steadily increases in numbers and availability...the horse market does not show the slightest effects of the automobile, the demand being as great and the prices as high as before the automobile came into use, ;)

Buffy


And mostly because it's slow. Ghastly so.
Why use IR when you've got Bluetooth, with much higher data rates, no line of sight problems and protocols that will support real-time transmission?

Exactly.

 

But why does it have much higher data rates? And any protocol can run over anything that can change between two binary states. An IR diode can be on or off. The protocol is not the issue. Line of sight isn't the problem, because IR is SLOOOOOOOWWWW, regardless.

 

I wanna figure out why.

 

As a mental exercise.

 

I don't intend to rally support for reviving a dead technology; personally, I hated IR.

 

Because it was SLOOOOOOOOOOWWW.

 

It's just interesting to figure out why it was so SLOOOOOOOOOWWWW.


But why does it have much higher data rates?

...

I wanna figure out why.

 

As a mental exercise.

...

It's just interesting to figure out why it was so SLOOOOOOOOOWWWW.

Oh, it's just the *usable* frequency of the carrier! Higher frequencies on the EM spectrum can *generally* carry far more information than lower ones in a given amount of time; however, this is strongly tempered by the efficiency of the transmitters and receivers in the link. While Bluetooth (at 2.4 GHz) uses a much lower carrier frequency than near-IR (roughly 300 THz)--thus making your question quite puzzling--the hardware's ability to actually make use of that carrier frequency is much more prone to error, and in fact the carrier rate is simply too high to capture easily. As a result, the communications implementation overlays a much lower effective modulation rate on the carrier to make up for it, and it ends up being much slower.
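The gap between carrier frequency and usable throughput can be made concrete with published figures: Bluetooth's basic rate is 1 Mbit/s on a 2.4 GHz carrier, while IrDA's common SIR mode tops out at 115.2 kbit/s on a near-infrared carrier of roughly 300 THz. The carrier-frequency value for near-IR is approximate; the point is the ratio, not the exact number.

```python
# Carrier frequency vs. usable data rate: the carrier only sets an upper
# bound; real throughput is fixed by how fast the hardware can modulate.
# Figures: Bluetooth basic rate (1 Mbit/s, 2.4 GHz carrier) and IrDA SIR
# (115.2 kbit/s, ~300 THz near-infrared carrier, approximate).

links = {
    "Bluetooth (basic rate)": {"carrier_hz": 2.4e9,  "data_bps": 1.0e6},
    "IrDA SIR":               {"carrier_hz": 3.0e14, "data_bps": 115_200},
}

for name, link in links.items():
    # Bits carried per carrier cycle: tiny for both, far tinier for IR,
    # i.e. IR hardware exploits its carrier far less efficiently.
    bits_per_cycle = link["data_bps"] / link["carrier_hz"]
    print(f"{name}: {bits_per_cycle:.2e} bits per carrier cycle")
```

Despite its vastly higher carrier, the IR link squeezes out orders of magnitude fewer bits per carrier cycle - the hardware, not the spectrum, is the bottleneck.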

 

To see this, try putting a running hair dryer in front of that IR transmitter or receiver....

 

I believe in getting into hot water; it keeps you clean, :P

Buffy


To see this, try putting a running hair dryer in front of that IR transmitter or receiver....

...as stated in the OP...

Interference plays a major role. Placing a hot cup of coffee within visible range of two IR ports plays havoc on data integrity, and causes packets to be continually resent, slowing things down badly.

I personally think that can be sorted by placing polarised filters (polarised for that particular frequency) over the receiver lenses.

 

But the essence of IR transmission is for the transmitting diode to change temperature. And the heating up/cooling down cycle between sending two bits of data must have an influence on speed. And I think that is the main bottleneck when it comes to IR transmission.

 

All other issues regarding IR usage (line of sight, interference etc.) are merely inconveniences, but will not have an impact on transmission speed.

 

What say you?


But the essence of IR transmission is for the transmitting diode to change temperature. And the heating up/cooling down cycle between sending two bits of data must have an influence on speed. And I think that is the main bottleneck when it comes to IR transmission.
Yah! Said that! ;)
the hardware's ability to actually make use of that carrier frequency is much more prone to error, and in fact the carrier rate is simply too high to capture easily. As a result, the communications implementation overlays a much lower effective modulation rate on the carrier to make up for it, and it ends up being much slower...
The hardware is totally sucky. Media transitions are always problematic (optical-to-electrical suffers badly from this), and the hardware that does the transition is usually far less efficient than the media on either side of the connection. This is especially true of IR, where--as you say--getting the transmitter to heat up and cool down is the electronic equivalent of a chipmunk-powered coffee pot.
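How much a sluggish emitter costs can be estimated with the standard first-order rule of thumb relating 10-90% rise time to bandwidth, BW ≈ 0.35 / t_rise. The rise-time values below are illustrative assumptions, not measured IrDA figures.

```python
# Rule-of-thumb link between an emitter's 10-90% rise time and usable
# bandwidth, for a first-order system: BW ≈ 0.35 / t_rise.
# Rise times below are illustrative assumptions, not measured values.

def max_bandwidth_hz(rise_time_s: float) -> float:
    """Approximate -3 dB bandwidth for a first-order response."""
    return 0.35 / rise_time_s

for label, t_rise in [("slow emitter, 1 us rise", 1e-6),
                      ("fast emitter, 10 ns rise", 10e-9)]:
    bw = max_bandwidth_hz(t_rise)
    print(f"{label}: ~{bw / 1e6:.1f} MHz usable bandwidth")
```

A microsecond-class rise time caps the channel well below a megahertz of bandwidth, which is consistent with the thread's hunch that emitter switching speed, not protocol or line of sight, is the real ceiling.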

 

But,

I personally think that can be sorted by placing polarised filters (polarised for that particular frequency) over the receiver lenses.
Really? How so? Ever look through one of those goofy night-vision thingies? It's all just green to me. Polarized lenses work with visible light because it's relatively coherent anyway; I'm pretty sure IR radiation is a lot less so and would not be helped by such lenses. But now I really don't know what I'm talking about...
All other issues regarding IR usage (line of sight, interference etc.) are merely inconveniences, but will not have an impact on transmission speed.
Sure, but small annoyances in situations where there is a close alternative can be a deal-killer. Rotary phones are not really that much harder to use than touch-tone ones, but can you find one? (Oh gosh, the youngsters will have no idea what "rotary" is... :eek_big: )

 

When you innovate, you've got to be prepared for everyone telling you you're nuts, :doh:

Buffy

