FOA Guide to Fiber Optics


The Fiber Optic Association - Tech Topics


 
Frequently Asked Questions (The FAQs)
 
We get questions all the time, so we have started putting those we feel have general interest on this web page. As it grows larger, we'll index it for convenience. Here are more testing FAQs too.
 
Looking for more information? Try the FOA Online Reference Guide or the Tech Topics index
 

 
The Questions We Get Most Often:
 
Can you explain to me in simple terms what the difference is between fiber optics and traditional copper cables?
We get this question so often, and it is such a complex issue, we created a whole web page to answer it!
 
Do signals really travel faster in fiber optics?
You know that "sending communications at the speed of light" really means the speed of light in glass (about 2/3 c), but you might be surprised to know that signals in UTP (unshielded twisted pair) cables like Cat 5e travel at about the same speed (2/3 c). Coax, meanwhile, has a faster NVP (nominal velocity of propagation), about 0.9 c, due to its design. Fiber's "speed" does not refer to the speed of the signal in the fiber, but to the bandwidth potential of the fiber.
 
Can you give me a definition of structured cabling?
"Structured Cabling" refers to a standardized cabling architecture, specified by EIA/TIA 568 in the US and ISO 11801 internationally. It uses twisted pair and fiber optic cables to create a standardized cabling system designed for telephones and LANs built by many manufacturers. The nomenclature here is even less precise. Vendors also refer to this as "structured cabling", data-voice cabling, low-voltage cabling and limited-energy cabling.

Which is the better way to communicate, wire or fiber optics?
The telcos and CATV companies use fiber optics because of economics. Its greater bandwidth and lower attenuation allow longer distances and more channels (voice or video) per fiber pair. Typical fiber specs are more than 100 times farther and 1000+ times faster! Try this for some more reading.
I'm a tech coordinator at a K-12 school district. Depending on what you read, who you talk to and which way the wind is blowing, the decision on whether to wire Cat 5 or fiber seems to be a toss-up. In your opinion, which is more cost-effective for a building that is trying to set up a solid switched backbone that will be usable for 5-10 years?
I've headed a school tech committee myself and work with local schools (had kids in school locally - both now in college) and they're asking the same question. Many are looking very seriously at fiber.
Many schools are now wired or being wired with a fiber optic backbone and Cat 5 to the desktop. That will support Fast Ethernet, which will probably be good for 5 years. It requires local hubs (which need conditioned power and UPSes), and the total cost is probably more than (or equal to) an all-fiber network today.
With fiber, you get more potential bandwidth, and you also get rid of the hubs (and all their additional needs, like power, space, management, etc.). You can buy 100Base-SX NIC cards for just over $100 and inexpensive fiber hubs from several companies, including Gemflex (www.gemflex.com), which uses the new low-cost 3M Volition connectors that save money on the cable plant. And you can upgrade at least to Gigabit Ethernet.
With copper, I'm afraid you buy into a cable-of-the-year club. The industry has already written off Cat 5, has been selling Cat 5e (enhanced), which has finally become a standard, is now pushing Cat 6A (a year or more away from being standard) and is talking about Cat 7!
The FOA Online Reference Guide has a complete section on cabling, fiber and copper, plus wireless.  
 
 

On General Fiber Optics:
 

What are some of the uses of fiber optic cabling in the business world?
The biggest use is telephony, followed by CATV, then LAN backbones connecting hubs. Next is connecting remote video cameras for security systems. Building management and security systems are switching to fiber in many buildings due to distance and EMI requirements. Fiber is not often used to the desk because it is perceived to be too expensive, but it allows a system without wiring closets, making the cost less in most instances. Gigabit Ethernet will drive even more fiber into networks, since the UTP versions will be more difficult to install.


 

Will "intelligent buildings" use fiber optics or copper wiring to carry voice/data/video throughout the structure?
Both. Fiber will be used when the distances are longer than 90 meters or data rates are higher (e.g. Gigabit Ethernet). Most backbones will be fiber. Desktop connections to telecom closets will be copper for the near future, until network managers find out what a telecom closet really costs! Phones will continue to use copper until we all go to voice over IP. Video (CCTV) uses fiber for distances over about 150-250 meters.

Can you please tell me what the difference is between dB and dBm when you are testing fiber optic cable?
Fiber optic power measurements are generally made on a log scale of "decibels" or "dB" (named after Alexander Graham Bell) that changes by 10 dB for every factor of 10 in power. The equation is:
 
dB=10 log (power 1/power 2)
 
dB is therefore a ratio measurement - 10 times more power is +10 dB and 100 times less is -20 dB, etc.
For ABSOLUTE measurements, you must have a reference point. If we use 1 milliwatt of power as our reference, the unit becomes "dBm" and our equation becomes
 
dBm = 10 log (power/1 mW)
 
So now 1 mW is 0 dBm, 10 mW is +10 dBm, 0.1 mW is -10 dBm, etc.
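To make the arithmetic concrete, here is a small Python sketch of the two formulas above (the power values are just examples):

    import math

    def db(power1_mw, power2_mw):
        # Relative measurement: the ratio of two powers expressed in dB.
        return 10 * math.log10(power1_mw / power2_mw)

    def dbm(power_mw):
        # Absolute measurement: power referenced to 1 milliwatt.
        return 10 * math.log10(power_mw / 1.0)

    print(db(10, 1))    # 10 times more power  -> +10.0 dB
    print(db(0.01, 1))  # 100 times less power -> -20.0 dB
    print(dbm(1.0))     #   1 mW ->   0.0 dBm
    print(dbm(10.0))    #  10 mW -> +10.0 dBm
    print(dbm(0.1))     # 0.1 mW -> -10.0 dBm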
Here's more information on testing, including dB.
 
I need to be able to measure the "true" or "useful" power of a VCSEL laser diode. To do this, I believe I need to measure the peak-to-peak power (the extinction ratio?).
All FO power meters measure average power, which is simply the peak power diluted by the duty cycle. If you know the duty cycle of the signal and the average power, you can calculate peak power as (average power / duty cycle). If you are measuring a signal with a clock at 50% duty cycle (1-0-1-0, etc.), the meter will read half the peak power. Most high speed networks send random data, so the duty cycle can usually be assumed to be 50%. Many systems have a test mode that transmits a 50% duty cycle pattern just for optical testing.
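As a quick illustration, here is a small Python sketch of that calculation (the meter reading is just an example value):

    def peak_power_mw(average_power_mw, duty_cycle):
        # Average power is peak power diluted by the duty cycle,
        # so peak = average / duty cycle.
        return average_power_mw / duty_cycle

    # A meter reading 0.5 mW average on a 50% duty-cycle test pattern
    # implies 1.0 mW peak power.
    print(peak_power_mw(0.5, 0.5))  # 1.0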
 
I am confused by the resolution choices in fiber optic power meters. If accuracy is +/-0.2dB, then it doesn't seem worthwhile to have resolution out to the hundredths & thousandths of a dB, when accuracy is only in the tenths (one decimal place).
Resolution for power meters is an interesting subject. Consider what happened when the first handheld 8-digit calculators became available in the early 70s: people would divide a two-digit number by a two-digit number and report the result to 8 digits! Of course, the precision of the answer was still two digits, not this "calculator precision."
If you measure power, and the measurement is accurate to (or, as NIST prefers, has a measurement uncertainty of) 0.2 dB, a measurement of 0.00 dB +/- 0.2 dB is confusing. The 1/100th dB resolution is in fact meaningless. If the uncertainty were 0.02 dB, hundredth-dB resolution would make sense. Remember, we are talking about "absolute power" measurements, calibrated relative to NIST standards.
If we are looking at loss measurements, things change considerably. The loss of an LC connector, about 0.1 dB, is measured relatively, e.g. -15.00 dBm to -15.10 dBm, and the measurement uncertainty now has nothing to do with the absolute power levels, but with the RELATIVE difference between the two readings. That difference is as precise as the linearity of the power meter (better than 0.01 dB) and the repeatability of the mating of the connectors (a few hundredths too). So you certainly want a meter with 0.01 dB resolution to test connectors!
If you are testing an installed cable plant with, say, 3-10 dB loss, the uncertainty is probably 0.5 dB, so 0.1 dB resolution is adequate. If it's a long haul network with 30 dB loss, the uncertainty can be over 1 dB, so 0.1 dB is more than adequate.
 
How do you classify fiber optic cable?
Broad question:
By NEC - UL - flame retardancy
By cable types (tight buffer/distribution/breakout/loose tube)
By fiber types (multimode/singlemode/hybrid)
Whether it has fiber and wire (composite cable)
See the cable section of our Online Reference Guide or  Lennie Lightwave for more information.
 
What type of fiber is required to run at gigabit speed?
Depends on how far you want to go. Plain old FDDI-grade fiber (160 MHz-km bandwidth @ 850 nm and 500 MHz-km @ 1300 nm) will go ~240 m with an 850 nm VCSEL or 500 m with a 1300 nm laser. Practically every fiber manufacturer has 50/125 laser-optimized premium fiber (OM3/OM4) that will go a lot further - as far as 2 km - and while it's more expensive, we recommend it for any backbone application.


I am just starting to plan for a fiber installation for our LAN. I am confused as to what cable to buy. Is it determined by our network hubs?
First, every fiber link uses two fibers, each transmitting in opposite directions, so you need at least two fibers. Most hubs in premises applications use multimode fiber of the standard 62.5/125 micron (core/cladding) dimensions. Most users specify "FDDI grade" fiber, although there is newer, higher bandwidth fiber available that is better if you expect to use it for Gigabit Ethernet someday. At a minimum, you should use a two-fiber "zipcord" with a jacket UL-rated for flame retardancy appropriate to the installation. If it runs through air-handling areas above ceilings, it should be plenum rated; otherwise riser or general-purpose ratings will be OK. Most installations put in larger fiber count cables, since the cost of installation is higher than the cost of the cable. Extra fibers in a cable are like memory in computers - cheap, so put in all you can justify! This is especially important if you are installing between locations that have lots of equipment that may use the fiber in the future. For larger fiber count cables, use breakout or distribution cable types - see Lennie Lightwave for descriptions of the cable types. Connectors should be chosen to match the connectors on the equipment (usually ST, LC or SC). A zipcord could be installed with connectors on each end, ready to go into the equipment. Multifiber cables would be terminated in rack-mounted patch panels or wall-mounted boxes and connected with patchcords.

 
I have a client who wants to connect his home computer to his office computer with a direct link. The distance between buildings is approx. 500 yards. I have told him he will need to use fiber cable between the buildings due to the long distance.
Buy a couple of FO media converters and some fiber optic cable (needs to be outside plant cable which is waterproof.) The cable should be multimode (62.5/125 or 50/125 micron size). Install the cable and terminate it. You can get wall plugs for fiber optics from many sources including many local distributors and computer stores. Attach the fiber cable to media converters. Connect one computer to a media converter with a regular Cat 5 cable and the other to a media converter with a Cat 5 CROSSOVER cable. Plug and play!
 
I am running a secure fiber optic cable through a conduit. Are there special tools to pull fiber optic cable vs. copper cable, and if so, where can I find these tools?
Fiber pulling gear is similar to other cable pulling gear, with one big caveat: ALL fiber optic cables must be pulled by the strength members, which are usually Kevlar (aramid) fibers. You should not simply put a Kellems grip on the jacket and pull - that will ruin most cables.
What is the difference between indoor and outside cables? Generally, outside cables are designed to resist water penetration by using a gel fill or dry water-blocking compounds and a polyethylene jacket. The new dry cables are getting very popular, since they can be made as distribution types which are easier to terminate. Many also have a PE jacket over a UL-rated PVC jacket, so you can bring the cable into the building, strip off the outside jacket and run it anywhere in the building (rather than the 50-foot limit for PE). Indoor or premises cable must be rated for flame retardance for safety and to meet code.
 
When is it viable to string fiber cable overhead? There are two solutions: self-supporting aerial cable or regular cable lashed to a messenger (maybe even the old telephone wire!) Most cablers can help you with a suggestion of the proper cable types.
How soon will it be until we are able to communicate via the telephone/internet/television by means of fiber optics? And how many fibres would be required for a small town?
The answer to this question is complicated. The Internet is all fiber optics today, as are most of the phone and CATV systems. It's only the final connections to the home that are still copper, and activity in that area is very high in 2005. The telephone companies have been pushing DSL, but it is a flawed concept - bandwidth is heavily dependent on the length of the lines, so generally it's not much better than a telephone modem. CATV companies are happy with coax cable, as it has gigabit capability. Both complain that fiber to the home is too expensive, but there are not many alternatives, forcing the issue from a competitive standpoint!
As to how much fiber is needed, that depends on the system used. Two fibers to the home are probably adequate (one transmits, the other receives). Backbone cables are usually 72-288 fibers, since it is more economical to install large fiber count cables now and leave them dark. Several techniques exist to multiplex signals on the fibers, including frequency-division, time-division and wavelength-division multiplexing, so one pair of backbone fibers can serve thousands of connections. No one answer here!
 
I need to connect an IBM 3174 controller (coax or balun to RJ45) to single mode fiber. Then connect the other end of the fiber to a printer (coax again). Local printers work easy with coax to balun to RJ45...then CAT 5 copper to balun to coax.
Whatever you need to convert to fiber, you can get a media converter somewhere.
 
We have a small network with a single server driving about 45 desktop PCs. Eight of the PCs are at the far end of the shop from the server (approx. 300 ft). The shop is filled with CNC equipment generating lots of "noise." I was thinking of coming from one of the 10/100 hubs (3Com OfficeConnect) at the server, through a Cat 5-to-fiber converter, along a fiber line to a fiber-to-Cat 5 converter, into an 8-port 10/100 hub and Cat 5 down to the PCs. The PCs don't need much bandwidth; it's just the cable length and interference problem. What do you think?
Good idea. A pair of media converters is only about $200-300 and the cable would be inexpensive.
 
We currently have a central computer room with several distribution closets (11) over 3 floors. We have fibre backbone to some of the rooms (18 multimode fibres) but some of the rooms do not have any fibre. We have a quote which includes running both multi-mode and single-mode fibre to these rooms. What benefit or purpose is there to the single mode, I thought this was only for excessive distances and probably not necessary "in-doors?" Are hubs and switches now available that use single-mode rather than multimode?
Most electronics are available for both multimode and singlemode fiber, but today singlemode is usually for long distances. However, singlemode fiber is so inexpensive that adding some singlemode fiber to every cable you install can make sense, and someday you may need its bandwidth. (Remember, 10 Gigabit Ethernet is coming!)
I have been recommending installing hybrid (MM+SM) cables for ten years as a hedge on future applications. Don't terminate the singlemode now, but have it available for future networks that may require it. Instead of an 18-fiber MM cable, price a 24-fiber cable with 18 MM and 6 SM. You will find the cost differential quite small, and if it saves installing another cable in the future, it will have enormous payback!
 

What is modal bandwidth, and how does it affect the distances Gigabit Ethernet can travel over fiber?
Modal bandwidth is limited by the fact that light in multimode fiber travels in rays or "modes" that take different times to get through the fiber, causing dispersion. The longer the fiber, the greater the effect. This is a major factor in the distance limitation of GbE and the incentive for fiber manufacturers to develop better multimode fiber.
While the worst case distance for 62.5/125 FDDI-spec fiber using an 850 nm VCSEL source is only 220 m, laser-optimized 50/125 fiber capable of 1 km is now available.
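As a rough illustration only, here is a small Python sketch of the usual rule of thumb that the bandwidth available over a multimode link scales inversely with its length. The real Gigabit Ethernet distance limits come from the IEEE 802.3 link model, which also accounts for attenuation, chromatic dispersion and DMD, so treat this as a back-of-the-envelope estimate:

    def effective_bandwidth_mhz(modal_bw_mhz_km, length_km):
        # Rule of thumb: the bandwidth available over a multimode link
        # scales inversely with its length.
        return modal_bw_mhz_km / length_km

    # FDDI-grade 62.5/125 fiber at 850 nm (160 MHz-km) over a 220 m link:
    print(effective_bandwidth_mhz(160, 0.220))  # ~727 MHz available
    # The same fiber at 1 km leaves only 160 MHz - much less headroom.
    print(effective_bandwidth_mhz(160, 1.0))    # 160 MHz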

How can a coherent laser singlemode source be correctly coupled to a long length of multimode fibre and a multimode receiver?
This is a difficult issue to address. Coherent lasers and multimode fiber have always had problems - that's why the telcos switched to SM fiber. The problem is that the SM cable launches into the center of the MM fiber and you get differential modal dispersion (DMD) problems. The center of the MM fiber is somewhat unpredictable, plus every connector and splice changes the modal distribution, adding to the uncertainty. For GbE, companies are offering offset launch cables that couple the SM fiber with an offset of about 15 microns. That seems to help. Laser-optimized 50/125 fiber should be a good solution. If your signal is analog, I don't think you will find a good solution, due to the distortion induced. If it's digital, you will have more luck, but it's still hard to predict!
 
I am distributing Satellite signal via single mode fiber. Can I patch into my multi-mode fiber data distribution system somehow using an adapter of some kind?
The connection from singlemode to multimode fiber is OK - the big multimode fiber catches all the light, but the other way is a problem - MM to SM gives 16-20 dB loss. Connectors (esp APC) must be matched. You can get adapters called "media converters" that will convert from singlemode to multimode and vice versa, within certain bandwidth limitations.

Do you see any real serious problems in splicing together fibre cables from different manufacturers, as long as the cable is manufactured to the same specifications?
No, not as long as they are the same type and size, e.g. multimode 62.5/125 or 50/125, and singlemode fibers should both be "normal" (non-dispersion-shifted) or both dispersion-shifted. Some singlemode fibers are made for 1300 nm only, 1550 nm only or both, and they should not be mixed. There are some other singlemode fibers that have special coatings that cannot be mixed with others. For singlemode, ask your fiber vendors or splicer supplier, or try it first before going into the field!
 
What is the difference in connectorization in tight buffered and loose tube type of cables?
A tight buffered cable can be terminated directly. The 900 micron buffer coating on the fiber is rugged enough to allow the connector to be attached directly, and if there is a 3 mm jacket, it is crimped to the connector for strength.
A loose tube cable has only a 250 micron buffer coating on the fiber, which is too fragile to attach a connector to directly. It has to be used with a breakout kit that sleeves the fiber in a protective tube before termination.
 
What is the distance limitation on 100 Mb/s fiber for singlemode and multimode? Also, what are the fiber sizes for each distance limitation?
Multimode 100 Mb/s (FDDI or Fast Ethernet) is limited to 2 km (1.2 miles) by the dispersion of the fiber and the chromatic dispersion of the LED source in the fiber. This holds for both 62.5/125 micron fiber and the 50/125 that exists in some older installations (and is being used for gigabit networks now, perhaps in error). Singlemode links are available for 20 km (12 miles) or more. All singlemode fiber for 1310 nm is the same, with a core diameter of about 8-9 microns and a cladding diameter of 125 microns.
 
Can I splice 62.5/125 fiber to 50/125 fiber? If so, what type of nominal loss would I be looking at for my loss budget?
If you splice it, you will get directional losses. Transmitting from 50 into 62.5 fiber, you'll get virtually no loss, but from 62.5 into 50, you will get a minimum of 1.6-1.9 dB loss due to the size and NA mismatch. (50 micron fiber has a lower numerical aperture (NA) than 62.5.) See The Fiber Optic Technicians Manual, Chapter 17, for a table of interconnection losses with different size fibers.
We have some basic R&D data on this topic, posted in this article connecting 50/125 to 62.5/125 fiber.
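For a back-of-the-envelope estimate, here is a small Python sketch of the worst-case geometric loss for an overfilled launch, based only on the core diameter and NA ratios. Measured losses, like the 1.6-1.9 dB quoted above, are usually lower because real sources do not fill every mode; the core sizes and NA values below are typical, assumed numbers:

    import math

    def worst_case_mismatch_loss_db(tx_core_um, tx_na, rx_core_um, rx_na):
        # Geometric worst case for an overfilled launch: only the fraction of
        # light that fits in the receiving fiber's core area and NA is coupled.
        core_fraction = min(rx_core_um / tx_core_um, 1.0) ** 2
        na_fraction = min(rx_na / tx_na, 1.0) ** 2
        return -10 * math.log10(core_fraction * na_fraction)

    # 62.5/125 (NA ~0.275) transmitting into 50/125 (NA ~0.20):
    print(worst_case_mismatch_loss_db(62.5, 0.275, 50.0, 0.20))  # ~4.7 dB worst case
    # 50/125 transmitting into 62.5/125: essentially no mismatch loss.
    print(worst_case_mismatch_loss_db(50.0, 0.20, 62.5, 0.275))  # 0.0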
 
Can you tell me what the unrepeatered length limitations are for 62.5/125 multimode fiber?
10BaseFX: 12.5 dB loss @ 850 nm, which could equal 4 km
100BaseFX: 2 km; bandwidth limited, not loss limited, depending on the LED source
Gigabit Ethernet: 220 m @ 850 nm (bandwidth limited, and this distance is likely to get longer) or 500 m at 1300 nm with a singlemode-pigtailed laser
These are on FDDI-grade 62.5/125 multimode fiber, which has a bandwidth of 160 MHz-km at 850 nm and 500 MHz-km at 1300 nm. 62.5/125 fiber with 500 MHz-km bandwidth at 850 nm and 50/125 laser-optimized fibers are available, specifically designed for premises applications of GbE.
Here is a page of specs on most current and legacy fiber optic systems.  
 
 
I am wondering what the proper pulling procedure is for premises type fiber cable. I have been told to use the kevlar to pull with, and I have also been told that pulling the cable by the kevlar can be damaging to the fibers. What do you recommend?
The cable has been designed to be pulled by the Kevlar and ONLY by the Kevlar. One of our demos in our Fiber U classes is to have students pull a zipcord hard by the jacket and see how it destroys the cable. Unless the cable has been specifically designed to be pulled by the jacket (which is usually a double-jacket design with Kevlar between the layers), you must strip back the jacket, cut off the fibers, expose the Kevlar, tie it off and use it to pull the cable.
Kevlar, by the way, is a DuPont trade name for aramid fiber.
Here is more information on installing fiber optic cable.
 
Does this still hold true with six fiber or 12 fiber premise style cable?
Absolutely. Remove the jacket, cut off the fibers and central strength member and tie a swivel onto the Kevlar. You'll need about 6 inches of Kevlar; tie it with a double knot on the swivel, then tape the Kevlar back along the cable, especially covering the end of the jacket, to prevent the Kevlar from pulling loose or the cable snagging while pulling.
 
I am interested in learning how to terminate fiber optic cable.
Take the free "Virtual hands-on" tutorial on Lennie Lightwave for hands-on details.
Here is more information on terminating fiber optic cable.

I have been informed by a company that they now have a small form factor fibre optic connector (LC) which has reduced loss to cater for the demands of Gigabit Ethernet, along with other benefits. The connector has not been ratified by any international standards body yet. Also, they inform me that the connector has been included in the IBM active equipment product range as well as their own and another hub supplier's range of products (not one of the big players).
As there may be numerous cross-connections to interlink hubs, this may exceed the dB loss allowed for Gigabit Ethernet. The LC connector with a lower loss would allow for a greater number of cross-connects.
What is your view on using proprietary fibre optic connectors?
There is little risk in using the LC connector. Of all the SFF (small form factor) connectors, it is the one that has become the most popular - in fact, it is the de facto standard connector for gigabit and 10 gigabit networks. The design is very well thought out: the smaller ferrule is easy to polish well and has excellent mating performance, which leads to low loss and low back reflection. It is also easy to terminate and test.
 
 
1. Are there any independent testing procedures for "rodent proof" optical cables, and what sort of qualifications are used to justify the claims?
There is a test referenced in GR-20, Issue 2. I have heard that there are facilities which still perform the test, but I am not familiar with any of them. I know of no one who is trying to justify the test.
2. What measurements are manufacturers using when they claim that their products offer "rodent resistance," "rodent proofing" or "rodent protection"?
The test referenced above did not use any of these terms, but just gave a damage rating. There is anecdotal information that some cable/duct types have less rodent damage in the field than other cable/duct types.
 
3. How are these three terms differentiated in terms of classification with respect to the rodent issue?
There is no organization, of which I am aware, which gives any formal definition to the terms.
"Rodent resistance" and "rodent protection" are currently used by some organizations for cable designs which they feel have experienced less damage in the field. "Rodent proof," to the best of my knowledge, has only been used by a company which makes ductwork.
 
 
Can you give me a definition of Fiber optics?
What we call "fiber optics" is communications using modulated light guided through a transparent optical fiber. As a relatively young technology, the nomenclature can be quite varied among users. In the UK, it's "fibre optics," and sometimes it's written as one word, "fiberoptics" or "fibreoptics." Within the business, we generally say "fiber" when we refer to the optical fiber itself, although some use it to mean a cable of optical fibers. Lennie Lightwave has a fiber optic glossary on the web.

What type of components are others using to facilitate fiber reel acceptance testing? I am looking for an inexpensive, fast and reliable method to use with an OTDR or power meter.
The normal way to test fiber on the reel is:
1. Physical examination of the reel and contents. Any sign of physical damage means you should test very carefully to ensure the cable or fiber has not been damaged. You can also read the distance markings off the cable to determine the length of cable on the reel.
2. Continuity test using a visible light source. MM can use a flashlight fiber tracer or a 650 nm (red) LED source. Cleave the fiber and use a bare fiber adapter or an unterminated connector on the end of the fiber. SM fiber should use a visible laser fault locator.
3. If you really need to do a loss test (usually a contractual issue, not a technical one, as the cable was tested thoroughly at the manufacturer before shipment), you can do a cutback test with a meter and source or an OTDR test.
3a. Source and meter: Use a launch cable with the source and a bare fiber adapter to connect to the fiber on the reel. Measure the power at the far end of the cable, again with a bare fiber adapter, then come back to the source end and cut off the fiber about 2-3 feet from the launch cable connection. Measure the power there - without touching the connection of the fiber under test to the launch cable! The difference in the two power measurements is the loss of the cable; divide by the length to get the attenuation in dB/km (see the sketch below).
3b. OTDR: Use a long launch cable on the OTDR and a bare fiber adapter on the fiber to be tested. Using a little index matching fluid will reduce the reflection and loss of the connection. Read the trace for attenuation and any localized stress losses.
Other references: The Fiber Optic Technicians Manual has a chapter (17) on testing, and Lennie Lightwave has a section on testing and a full article on OTDRs.
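Here is a small Python sketch of the cutback arithmetic in step 3a (the power readings and reel length are made-up example values):

    def cutback_attenuation_db_per_km(near_end_dbm, far_end_dbm, length_km):
        # The cutback loss is the near-end reading (a few feet from the launch
        # cable) minus the far-end reading; dividing by the cable length gives
        # the attenuation coefficient.
        loss_db = near_end_dbm - far_end_dbm
        return loss_db / length_km

    # Illustrative numbers only: -20.0 dBm near the launch, -26.2 dBm at the
    # far end of a 2 km reel.
    print(cutback_attenuation_db_per_km(-20.0, -26.2, 2.0))  # 3.1 dB/km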
 
Please tell me how an optical power meter differs from an OTDR.
A fiber optic power meter measures the amount of power coming out of a fiber, just like a voltmeter measures voltage. See FOTP-95, the standard test procedure for power measurement. When used with a test source, it can be used to measure the end-to-end loss of a fiber optic cable or installed cable plant. FOTP-171, OFSTP-14 and OFSTP-7 cover this test.
OTDRs work like "optical radar" to find faults in cables, measure length or test the loss of splices in cables. They are used for troubleshooting or for documentation of outside plant cables that have splices (splicing is not often used in premises cabling).
Here is a more detailed explanation of all the options in cable testing. See this page for more info on OTDRs.
 
What should the length of the launch cable be from the source to the cable being tested? Also, I need a receive cable from the meter end to the connector - should it be the same length?
Any cable 1-5 meters in length works fine.
 
What do I need for connecting Optic Fibre Cable to a Cat 5 Cable?
You need a device called a "media converter" available from a number of companies for $100-200.
 
Is it possible to send a forward and reverse signal along the same fiber?
WDM is only for sending two frequencies on the fiber in the same direction...right?
Nope, the direction doesn't matter. FTTH PONs use this technique to send signals both ways over one fiber: upstream on one wavelength and downstream on the other. It's been done for years using 1300 and 1550 nm on singlemode or 850 and 1300 nm on multimode. For example, you could use a 1550 nm transmitter with a fiber amplifier to broadcast out to numerous locations, then use 1300 nm coming back upstream, with 1300/1550 WDMs at each end.
 
In fiber, what is a "mode" actually? "Multimode" seems to imply discrete, separate paths, but I see no way discontinuities occur in glass. Is it a frequency thing?
Multimode fiber is actually made up of many discrete layers of glass to create the graded index profile - there may be 150-2000 layers in the core. Modes are the equivalent of "standing waves" in a fiber: paths along which the electromagnetic fields reinforce each other. You can actually see the effects of modes by transmitting a coherent source (a laser) down a multimode fiber - you see a "speckle pattern," which is the result of interference between discrete modes. A good explanation is in Jeff Hecht's book Understanding Fiber Optics.
 
Will a single mode connector work on multi-mode cable?
The answer is maybe: you can use SM connectors on MM fiber, but NOT the reverse. SM connectors are made to tighter tolerances - as is SM fiber - so the ferrule hole may be too small for some MM fibers to fit. MM connectors have bigger holes for the fiber and will have high loss (>1 dB) with SM fiber, and MM connectors may not have a PC (physical contact) polish - terrible for return loss.
 
Could you please explain to me what optical return loss is?
It's simply reflection at a connector or splice caused by imperfect mating of the fibers (reflectance), although some companies define it as a measurement of a whole cable plant that includes backscattered light as well as reflectance from individual components. The reflected light may cause a laser transmitter to have problems with linearity or create background noise that affects transmission. It does not affect LED systems.
SM cabling in a premises environment can have big ORL problems, as the short cables allow multiple reflections that cause optical "background noise." Good singlemode return loss is 30-40 dB for PC connectors, 40-50 dB for "super PC" and >50 dB for APC connectors.
Multimode gigabit systems are now concerned with ORL too (they use VCSELs) and specify something better than -20 to -30 dB, depending on the system.
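Here is a small Python sketch of the basic arithmetic, showing both sign conventions used above (reflectance of a single event quoted as a negative number, ORL of a cable plant quoted as a positive number); the power values are just examples:

    import math

    def reflectance_db(reflected_mw, incident_mw):
        # Reflectance of a single connector or splice, a negative dB number
        # (more negative is better).
        return 10 * math.log10(reflected_mw / incident_mw)

    def orl_db(total_returned_mw, launched_mw):
        # Optical return loss of a whole cable plant, usually quoted as a
        # positive number - the bigger, the better.
        return -10 * math.log10(total_returned_mw / launched_mw)

    # 1 mW launched, 10 microwatts (0.01 mW) reflected back:
    print(reflectance_db(0.01, 1.0))  # -20.0 dB reflectance
    print(orl_db(0.01, 1.0))          # +20.0 dB ORL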
Here is more information on Reflectance and optical return loss.  
 
If you have a 50 micron fiber backbone, can you use 62.5 fiber jumpers on each end?
NO! On the receiver end it is OK, but on the transmitter end, the larger 62.5 micron core launching into the smaller 50 micron fiber will have losses of 2-4 dB.

If you have 62.5 fiber backbone, can you use 50 micron patch cords?
NO! Same as above, except the excess loss is at the receiver end. In both cases, the losses depend on the modal distribution in the fiber, a result of the source output and the number of connections. Information on mismatched fibers is here.
 
 
Why do some power meters have calibration at 1300 nm while others are 1310 nm?
Convention. The "official" laser center wavelength is 1310 nm, but actual lasers vary between 1290 and 1330 nm. LEDs are broad spectral output devices with outputs over a wide range of wavelengths, roughly centered around 1300 nm. We prefer to just say 1300 nm, and so does NIST, which calibrates at that wavelength with a 1300 nm YAG laser.
 
I am a safety officer involved in the construction business for a major manufacturer. Our own construction forces are now installing and connecting fiber optic cable. I am having difficulty finding any information on safety procedures for this activity. Any information you have regarding safety and disposal of "waste ends" would be appreciated.
We have added a page on safety to the FOA website, and all our books and the reference guide cover safety.
 
Where do I find the best information on fiber optics for lighting?
We're not into fiber optic lighting but we have a tutorial on lighting on our website.


 

 

(C) 2002-11, The Fiber Optic Association, Inc.

