• thatKamGuy@sh.itjust.works · 8 upvotes · 13 days ago

    It’s been more of a pain in the arse than initially expected.

    Most motherboards (for example) only have 2-4 USB-C ports, meaning I still need a mix of A-to-C and C-to-C cables for peripherals etc.

    My main gripe is that the standard just tries to do too many things without clear delineation/markings:

    1. Is it a USB 2.0 (480 Mbit/s), 5 Gbit/s, 10 Gbit/s or 20 Gbit/s cable? You can’t really tell from the plug alone.

    2. More importantly, for charging devices: how the heck do I determine the maximum wattage I can push through it?

    For all its faults, at least the blue colour of a USB 3.0 plug (or the additional connectors on B/Micro) made it easy to differentiate!

    Now I’m eyeing up a USB Cable tester just to validate and catalogue my growing collection! 🤦🏻‍♂️

    • UltraHamster64@lemmy.world · 4 upvotes · 13 days ago

      It’s even more annoying that there are different possible pinouts in the port itself without clear labelling. So always use the cable that came with the peripheral, or you risk frying it.

      • thatKamGuy@sh.itjust.works · 2 upvotes · 13 days ago

        I was actually thinking coloured O-rings to denote specs, but that still means I’d need a colour guide somewhere too…

        …yours might be a more practical solution. 🤔
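
        Something like this is what I had in mind for the guide (a quick sketch only; the colour/spec pairings here are completely made up for illustration, not anything from the USB-IF):

        ```python
        # Hypothetical O-ring colour guide: every colour and pairing below is
        # invented for illustration, not part of any USB spec.
        ORING_GUIDE = {
            "black":  {"data": "USB 2.0 (480 Mbit/s)", "power": "60 W"},
            "blue":   {"data": "5 Gbit/s",             "power": "60 W"},
            "green":  {"data": "10 Gbit/s",            "power": "100 W"},
            "red":    {"data": "20 Gbit/s",            "power": "100 W"},
            "yellow": {"data": "40 Gbit/s",            "power": "240 W"},
        }

        def label(colour: str) -> str:
            """Human-readable tag to print and stick next to the cable bin."""
            spec = ORING_GUIDE[colour]
            return f"{colour}: {spec['data']}, up to {spec['power']}"

        for colour in ORING_GUIDE:
            print(label(colour))
        ```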

    • tetris11@lemmy.ml (OP) · 3 upvotes · 13 days ago

      I wonder about this too. Can I plug my laptop’s USB-C charger into my phone? Or is that a big no-no?

      • thatKamGuy@sh.itjust.works · 6 upvotes · 13 days ago

        Yes, you can. The charger and the device negotiate what each supports and pick the highest power level they both agree on.

        E.g. my laptop charger charges my MacBook at full speed (100 W), but my iPhone only at 20 W.

        That bit is pretty straightforward and transparent to end users (there are a few rare cases where devices can’t agree on the fastest profile and fall back to a slower one); the bigger issue is cables without wire of sufficient gauge, or with missing connections, which prevent the charger and device from negotiating their full capability.
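
        If it helps to picture it, here’s a very rough sketch of that negotiation logic (just a toy model; the real protocol exchanges source-capability and request messages over the CC pin, and the charger/device numbers below are assumptions, not measured values):

        ```python
        # Toy model of USB-PD power negotiation: pick the highest wattage that
        # the charger's offers, the cable's current limit and the device's
        # maximum draw all allow. Numbers are illustrative assumptions.
        CHARGER_PDOS = [(5.0, 3.0), (9.0, 3.0), (15.0, 3.0), (20.0, 5.0)]  # (volts, max amps) the charger offers
        DEVICE_MAX_W = {"macbook": 100.0, "iphone": 20.0}                  # most each device will draw

        def cable_amp_limit(e_marked: bool) -> float:
            # Cables without an e-marker chip are capped at 3 A
            return 5.0 if e_marked else 3.0

        def negotiate(device: str, e_marked_cable: bool) -> float:
            """Highest wattage the charger, cable and device all agree on."""
            best = 0.0
            for volts, amps in CHARGER_PDOS:
                usable_amps = min(amps, cable_amp_limit(e_marked_cable), DEVICE_MAX_W[device] / volts)
                best = max(best, volts * usable_amps)
            return best

        print(negotiate("macbook", e_marked_cable=True))   # 100.0 -> full-speed laptop charging
        print(negotiate("macbook", e_marked_cable=False))  # 60.0  -> capped by the 3 A cable
        print(negotiate("iphone", e_marked_cable=True))    # 20.0  -> the phone is the limit
        ```

        That “capped by the 3 A cable” case is exactly the wire-gauge/missing-connections problem above: the charger can do more, the device wants more, but the cable is the weakest link.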