Fiberopticvideos.com

Most Popular Articles


  • Why Does FTTH Develop So Rapidly?

    FTTH (Fiber to the Home) is a form of fiber optic communication delivery in which the optical fiber reaches the end user’s home or office from the local exchange (service provider). FTTH was first introduced in 1999, and Japan was the first country to launch a major FTTH program. Deployment of FTTH is now increasing rapidly, and more than 100 million consumers worldwide use direct fiber optic connections. Why does FTTH develop so rapidly?

    FTTH is a reliable and efficient technology with many advantages, such as high bandwidth, low cost, and fast speeds, which is why it is so popular and is developing so rapidly. Let’s take a look at those advantages.


    • The most important benefit of FTTH is that it delivers high bandwidth and is a reliable and efficient technology. In a network, bandwidth is the capacity to carry information: the more bandwidth, the more information can be carried in a given amount of time. Experts from the FTTH Council say that FTTH is the only technology able to meet consumers’ high bandwidth demands.
    • Even though FTTH provides greatly enhanced bandwidth, the cost is not very high. According to the FTTH Council, cable companies spent $84 billion a decade ago to pass almost 100 million households with lower-bandwidth, lower-reliability technology, and it costs much less in today’s dollars to wire those households with FTTH.
    • FTTH provides faster connection speeds and larger carrying capacity than twisted pair conductors. For example, a single copper pair can carry only six phone calls, while a single fiber pair can carry more than 2.5 million phone calls simultaneously. More and more companies from different business areas are installing it in thousands of locations all over the world.
    • FTTH is also the only technology that can handle the futuristic Internet uses expected when 3D “holographic” high-definition television and games (products already in use in industry, and on the drawing boards at big consumer electronics firms) are in everyday use in households around the world. Think 20 to 30 Gigabits per second in a decade; no current access technology can deliver that.
    • FTTH broadband connections will also bring about the creation of new products, because they open new possibilities for data transmission rates. Just as mobile video, iPods, HDTV, telemedicine, remote pet monitoring and thousands of other now-common products were not even on the drawing board 5 or 10 years ago, FTTH broadband connections will inspire new products and services and could open entire new sectors in the business world, experts at the FTTH Council say.
    • FTTH broadband connections will also allow consumers to “bundle” their communications services. For example, a consumer could receive telephone, video, audio, television and just about any other kind of digital data stream over a single FTTH broadband connection. This arrangement would be more cost-effective and simpler than receiving those services via different lines.

    As the demand for broadband capacity continues to grow, governments and private developers will likely do more to bring FTTH broadband connections to more homes. According to one report, Asian countries tend to outpace the rest of the world in FTTH market penetration, because governments in the Asia-Pacific region have made FTTH broadband an important strategic consideration in building their infrastructure. South Korea is a world leader, with more than 31 percent of its households boasting FTTH broadband connections. Japan, the United States, and several western countries are also building out their FTTH networks on a large scale. It is an inevitable trend that FTTH will continue to grow worldwide.

    Read more »
  • Basics of Fiber Optic Splicing

    Fiber optic splicing is becoming an increasingly common skill requirement for cabling technicians. A fiber optic splice is defined as a permanent, or relatively permanent, connection between two fiber optic cables. Fiber optic cables might have to be spliced together for a number of reasons, for example to create a link of a particular length or to repair a broken cable or connection. Because fiber optic cables are generally manufactured only in lengths up to about 5 km, a 10 km link, for example, requires two lengths to be spliced together into a permanent connection.

    Classification of Techniques Used for Optical Fiber Splicing

    Mechanical splices
    Mechanical splices are normally used when splices need to be made quickly and easily. A mechanical fiber optic splice can take as little as five minutes to make, although the light loss is around ten percent. However, this level of loss is still better than what can be obtained with a connector. Some sleeves for mechanical fiber optic splices are advertised as allowing connection and disconnection, so a mechanical splice can also be used in applications where the joint may be less permanent.

     

    Fusion splices
    This type of connection is made by fusing or melting the two fiber ends together. A fusion splice uses an electric arc to weld two fiber optic cables together and requires specialized equipment to perform. Fusion splices offer a lower level of loss and a high degree of permanence; however, they require expensive fusion splicing equipment.

    Mechanisms of Light Loss at Optical Fiber Joint

    When joining optical fibers, the opposed cores must be properly aligned. Most optical fiber splice loss arises in the following ways.

    Poor concentricity
    Poor concentricity of the joined optical fibers causes splice loss. For general purpose single-mode fibers, the splice loss in dB is roughly the square of the misalignment (in µm) multiplied by 0.2. (For example, at a light source wavelength of 1310 nm, a misalignment of 1 µm results in approximately 0.2 dB of loss.)
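
    As a quick illustration of that rule of thumb, here is a minimal Python sketch. The quadratic form and the 0.2 coefficient come from the description above and apply to general purpose single-mode fiber at 1310 nm; other fiber types and wavelengths have different coefficients.

    def misalignment_loss_db(offset_um, coefficient=0.2):
        """Approximate splice loss (dB) for a core offset given in micrometres."""
        return coefficient * offset_um ** 2

    for offset in (0.5, 1.0, 2.0):
        print(f"{offset:.1f} um offset -> ~{misalignment_loss_db(offset):.2f} dB")
    # 0.5 um offset -> ~0.05 dB
    # 1.0 um offset -> ~0.20 dB   (matches the example in the text)
    # 2.0 um offset -> ~0.80 dB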

    Axial run-out
    Splice loss also occurs when there is angular misalignment (axial run-out) between the optical axes of the fibers being joined. For example, when using an optical fiber cleaver before fusion splicing, an excessive angle on the fiber end face must be avoided, since such an angle can result in the fibers being spliced with run-out.

    Gap
    An end gap between optical fibers also causes splice loss. For example, if the fiber end faces are not correctly butt-joined in mechanical splicing, a splice loss occurs.
     
    Reflection
    An end gap between optical fibers also causes reflection: the step in refractive index from the glass to the air produces Fresnel reflections at the end faces, adding up to about 0.6 dB of loss at the maximum. In addition, the optical fiber ends should be kept clean, because dirt between the end faces also causes loss.
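
    For a rough sense of where a figure of that order comes from, here is a minimal Python sketch of normal-incidence Fresnel reflection at a glass-to-air interface. The refractive index of 1.46 is an assumed typical value for silica fiber, and the worst case treats the air gap as an idealized etalon whose two reflections add in phase; this is an illustration, not a measurement.

    import math

    n_fiber, n_air = 1.46, 1.0                        # assumed typical silica index vs. air
    r = ((n_fiber - n_air) / (n_fiber + n_air)) ** 2  # power reflectance per end face

    loss_per_face_db = -10 * math.log10(1 - r)                  # crossing one glass/air face
    worst_gap_db = -10 * math.log10(((1 - r) / (1 + r)) ** 2)   # two faces, reflections in phase

    print(f"reflectance per face: {r:.3f}")                     # ~0.035 (about 3.5 %)
    print(f"loss per face:        {loss_per_face_db:.2f} dB")   # ~0.15 dB
    print(f"worst-case gap loss:  {worst_gap_db:.2f} dB")       # ~0.6 dB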

    Classification and Principles of Fusion Splices

    Fusion splicing is classified into two methods, as follows:

    Core alignment

    The optical fiber cores are observed with a microscope and positioned with the help of image processing so that they are concentrically aligned. An electric arc is then applied to fuse the fiber cores. A fusion splicer of this type has cameras for observation and positioning in two directions.


    Stationary V-groove alignment

    This fusion splicing method uses V-grooves produced with high precision to position and orient the optical fibers, and relies on the surface tension of the melted fibers for alignment (cladding alignment). Splices made by this method achieve low loss thanks to recent advances in optical fiber production technology, which have improved the dimensional accuracy of core placement. This method is primarily used for splicing a multi-fiber cable in a single action.

     

    Tips for Better Splices:

    1. Thoroughly and frequently clean your splicing tools. Keep in mind that particles not visible to the naked eye can cause tremendous problems when working with fiber optics. "Excessive" cleaning of your fiber and tools will save you time and money down the road.
     
    2. Properly maintain and operate your cleaver. The cleaver is your most valuable tool in fiber splicing. In mechanical splicing you need the proper cleave angle to ensure good end faces; otherwise too much light escapes into the air gap between the two fibers. The index matching gel will eliminate most of that light escape, but it cannot overcome a low quality cleave. You should expect to spend around $200 to $1,000 for a good quality cleaver suitable for mechanical splicing.
     
    For fusion splicing, you need an even more precise cleaver to achieve exceptionally low loss (0.05 dB and less). With a poor cleave, the fiber ends might not melt together properly, causing light loss and high reflection. Expect to pay $1,000 to $4,000 for a cleaver that can handle the precision required for fusion splicing. Maintaining your cleaver by following the manufacturer's cleaning instructions, and using the tool properly, will give you a long-lasting piece of equipment and ensure the job is done right the first time.
     
    3. Adjust fusion parameters minimally and methodically (fusion splicing only). If you start changing the fusion parameters on the splicer as soon as there is a hint of a problem, you might lose your desired settings. Check for dirty equipment first, and then move on to the parameters. Fusion time and fusion current are the two key factors in splicing, and different combinations of the two can produce the same splice result: a long fusion time at low current gives the same outcome as a short fusion time at high current. Change one variable at a time, as in the sketch below, and keep checking until you have found the right fusion parameters for your fiber type.
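
    The sketch below is a minimal, hypothetical illustration of that one-variable-at-a-time discipline. The candidate values and the measure_splice_loss callback are placeholders, not settings or an interface for any particular splicer.

    def tune_fusion_parameters(measure_splice_loss,
                               time_candidates_s=(1.5, 2.0, 2.5),
                               current_candidates_ma=(15.0, 16.0, 17.0)):
        """One-variable-at-a-time tuning: hold current fixed and sweep time, then
        hold the best time fixed and sweep current. measure_splice_loss(time_s,
        current_ma) stands in for making a trial splice and measuring its loss in dB."""
        baseline_current = current_candidates_ma[len(current_candidates_ma) // 2]
        best_time = min(time_candidates_s,
                        key=lambda t: measure_splice_loss(t, baseline_current))
        best_current = min(current_candidates_ma,
                           key=lambda c: measure_splice_loss(best_time, c))
        return best_time, best_current

    # Toy stand-in for a real measurement, only to show the call pattern.
    toy = lambda time_s, current_ma: abs(time_s * current_ma - 33.0) * 0.01
    print(tune_fusion_parameters(toy))   # (2.0, 16.0)
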
    Read more »
  • T-Mobile becomes number one US smartphone channel

    Written by Scott Bicheno, Telecoms.com


    Disruptive US operator T-Mobile has become the leading sales channel for smartphones in the US, according to new research from Counterpoint.

    T-Mobile overtook Verizon to take the number one smartphone sales spot, having been a distant fourth just two years ago. This change is viewed as indicative of a broader change in the way smartphones are being purchased in the US, with the cost of devices increasingly uncoupled from the service contracts and, if needed, paid for via conventional financing arrangements.

    “The US market has undergone significant shifts in the power of the different sales channels with the move to unsubsidized plans,” said Neil Shah of Counterpoint. “The growth of T-Mobile through its different ‘Uncarrier’ moves, the removal of subsidies and enticing subscribers with ‘Simple Choice’ & ‘Jump’ plans, has helped the operator to become the top smartphone sales channel in the USA.

    “Samsung and Apple together captured almost two-thirds of the total smartphone shipments share at T-Mobile, with Samsung leading. However, it will be an uphill task for T-Mobile to maintain this lead ahead of Verizon and continue to attract millions of subscribers to its network. The move to unsubsidized and unlocked has also boosted demand in the open channel, which continued to contribute close to 10% of the total shipments in Q1 2016.”


    US smartphone sales on the whole declined by 4% year-on-year due to the maturity of the market (most people already have a smartphone) and a lengthening of the upgrade cycle. The latter factor is a direct result of the shift in buying habits, as fewer consumers are being prompted to upgrade their subsidized phones by the renewal of their postpaid contracts.

    “The US market decelerated due to softness in Apple iPhone demand and iPhone SE demand not materializing until Q2 2016,” said Jeff Fieldhack of Counterpoint. “Carriers continued to push subscribers to non-subsidy plans as for the first time more than half of the combined subscriber base of the top four carriers are now on non-subsidized plans. This is a significant shift from the subsidy-driven model just ten to twelve quarters ago. This has changed the basis of competition in US mobile landscape.

    “The focus has shifted to creating more value for the consumer, instead of being device-driven. Unsubsidized device sales have educated consumers that flagship smartphones are costly. This has led to a temporary softness in the device upgrade cycle; the in-carrier upgrade run rate continues to be in 5-6% range per quarter. Handset manufacturers will continue to push hardware and marketing limits to entice subscribers to not defer upgrading.”

     

    Read more »
  • Introduction to Bi-Directional Transceiver Modules

    Almost all modern optical transceivers use two fibers to transmit data between switches, firewalls, servers, routers, and so on. The first fiber is dedicated to receiving data from the networking equipment, and the second fiber is dedicated to transmitting data to it. But there is a type of fiber optic transceiver module, the BiDi (bi-directional) transceiver, that breaks this rule. What is a BiDi transceiver? How does it work? And why do people believe it has broad market prospects? This tutorial will give you the answers.

    What Is a BiDi Transceiver?

    A BiDi transceiver is a type of fiber optic transceiver that uses WDM (Wavelength Division Multiplexing) bi-directional transmission technology, so that optical channels can propagate simultaneously in both directions over a single fiber. A BiDi transceiver has only one port, which uses an integral bidirectional coupler to transmit and receive signals over a single optical fiber. For this reason, BiDi transceivers must be used in pairs.

    How Does a BiDi Transceiver Work?

    The primary difference between BiDi transceivers and traditional two-fiber fiber optic transceivers is that BiDi transceivers are fitted with Wavelength Division Multiplexing (WDM) couplers, also known as diplexers, which combine and separate data transmitted over a single fiber based on the wavelengths of the light. For this reason, BiDi transceivers are also referred to as WDM transceivers.

    To work effectively, BiDi transceivers must be deployed in matched pairs, with their diplexers tuned to match the expected wavelength of the transmitter and receiver that they will be transmitting data from or to.

    For example, if paired BiDi transceivers are being used to connect Device A (Upstream) and Device B (Downstream), as shown in the figure below, then:

    • Transceiver A’s diplexer must have a receiving wavelength of 1550nm and a transmit wavelength of 1310nm
    • Transceiver B’s diplexer must have a receiving wavelength of 1310nm and a transmit wavelength of 1550nm
    Diplexers at Work in BiDi Optical Ethernet Transceivers
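
    Given the pairing rule above, here is a minimal Python sketch that checks whether two BiDi modules form a valid pair. The class and the wavelength values are illustrative, following the 1310nm/1550nm example, and are not a vendor API.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class BiDiTransceiver:
        name: str
        tx_nm: int   # transmit wavelength
        rx_nm: int   # receive wavelength

    def is_valid_pair(a, b):
        """A valid BiDi pair transmits on the wavelength its partner receives."""
        return a.tx_nm == b.rx_nm and b.tx_nm == a.rx_nm

    upstream = BiDiTransceiver("Transceiver A", tx_nm=1310, rx_nm=1550)
    downstream = BiDiTransceiver("Transceiver B", tx_nm=1550, rx_nm=1310)

    print(is_valid_pair(upstream, downstream))   # True
    print(is_valid_pair(upstream, upstream))     # False: two identical modules cannot be paired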

    Advantages of BiDi Transceivers

    The obvious advantage of utilizing BiDi transceivers, such as SFP+ BiDi and SFP BiDi transceivers, is the reduction in fiber cabling infrastructure costs: fewer fiber patch panel ports, less tray space dedicated to fiber management, and less fiber cable.

    While BiDi transceivers (a.k.a. WDM transceivers) cost more to initially purchase than traditional two-fiber transceivers, they utilize half the amount of fiber per unit of distance. For many networks, the cost savings of utilizing less fiber is enough to more than offset the higher purchase price of BiDi transceivers.
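
    As a back-of-the-envelope illustration of that trade-off, here is a minimal Python sketch. All of the prices and the link length are hypothetical placeholders, not figures from the article, and the model ignores the patch panel and tray space savings mentioned above.

    def link_cost(transceiver_price, fibers_needed, fiber_cost_per_m, link_length_m):
        """Cost of one link: two transceivers plus the fiber run between them."""
        return 2 * transceiver_price + fibers_needed * fiber_cost_per_m * link_length_m

    length_m = 300.0   # hypothetical run
    duplex = link_cost(transceiver_price=40.0, fibers_needed=2,
                       fiber_cost_per_m=0.25, link_length_m=length_m)
    bidi = link_cost(transceiver_price=60.0, fibers_needed=1,
                     fiber_cost_per_m=0.25, link_length_m=length_m)

    print(f"Duplex link: ${duplex:.2f}")   # $230.00
    print(f"BiDi link:   ${bidi:.2f}")     # $195.00

    The fiber term scales with distance while the transceiver premium is fixed, so the longer the run, the more the single-fiber approach pays off.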

    Read more »
  • ARM’s new CPU and GPU will power mobile VR in 2017

     

    ARM, the company that designs the processor architectures used in virtually all mobile devices on the market, has used Computex Taipei 2016 to announce new products that it expects to see deployed in high-end phones next year. The Cortex-A73 CPU and Mali-G71 GPU are designed to increase performance and power efficiency, with a particular view to supporting mobile VR.

    ARM says that its Mali line of GPUs is the most widely used in the world, with over 750 million shipped in 2015. The new Mali-G71 is the first to use the company's third-generation architecture, known as Bifrost. The core allows for 50 percent higher graphics performance, 20 percent better power efficiency, and 40 percent more performance per square millimeter than ARM's previous Mali GPU. With scaling up to 32 shader cores, ARM says the Mali-G71 can match discrete laptop GPUs like Nvidia's GTX 940M. It has also been designed around the specific demands of VR, supporting features like 4K resolution, a 120Hz refresh rate, and 4ms graphics pipeline latency.

     

    As for CPUs, ARM is announcing the new Cortex-A73 core, which prioritizes power efficiency. It is up to 30 percent more efficient than the previous Cortex-A72 while offering about 1.3 times the peak performance, but ARM has also focused on sustained usage: the A73 offers over twice the performance within its power budget, meaning it doesn't need to slow down as quickly to save battery life.

     


     

    Although ARM architecture dominates the mobile landscape, there's a good chance you won't see these specific products in your 2017 flagship phone. ARM licenses its architecture and cores separately, meaning companies are free to pick and choose what they like. Apple, for example, licenses ARM architecture but now designs its own custom CPU cores (known as Twister in the most recent A9 processor) and uses PowerVR GPU solutions from Imagination Technologies. Samsung, meanwhile, designs some Exynos processor cores but uses them alongside ARM's Cortex cores and Mali GPU in the international Galaxy S7. And Qualcomm reverted to its own custom Kryo CPU cores in the Snapdragon 820 — which powers the US Galaxy S7 — after using Cortex in the 810.

    All of this is to say that you shouldn't take the performance laid out here by ARM as a benchmark for your next phone, because it'll all depend on how the manufacturers choose to implement the technology. But the new Cortex and Mali products do demonstrate that mobile technology continues to advance in terms of power and efficiency, and that it's adapting to new challenges such as VR.

    ARM expects the chips to move into production at the end of the year and to appear in shipping devices in early 2017.

    Read more »