this post was submitted on 24 Nov 2024
1104 points (92.6% liked)

[–] HeartyOfGlass@lemm.ee 142 points 15 hours ago (5 children)

Fuck FireWire. Glad it's dead. USB-C is the best thing to happen to peripherals since the mouse.

[–] FlyingSquid@lemmy.world 87 points 14 hours ago (17 children)

USB-C is the best thing to happen to peripherals since the mouse.

I would agree with you if there were a simple way to tell, just by looking at it, what the USB-C cable in my hand can be used for. As it is, I don't even know whether a given cable will charge my device or not. A simple way of labeling cables by capability should have been baked into the standard. The concept is terrific, but the execution can be extremely frustrating.

[–] ilinamorato@lemmy.world 9 points 10 hours ago

Buying a basic, no-frills USB-C cable from a reputable tech manufacturer all but guarantees that it'll work for essentially any purpose. Of course the shoddy pack-in cables included with a cheap device purchase won't work well.

I replaced every USB C-to-C or A-to-C cable and brick in my house and carry bag with inexpensive Anker ones (except the ones that came with my Google products; those are fine), and now anything charges on any cable.

You wouldn't say a razor sucked just because the cheap replacement blades you bought at the dollar store nicked your face, or that a pan was too confusing because the dog food you cooked in it didn't taste good. Likewise, it's not the fault of USB-C that poorly manufactured charging bricks and cables exist. The standard still works; in fact, it works so well that unethical companies are flooding the market with crap.

[–] HeartyOfGlass@lemm.ee 45 points 14 hours ago (1 children)

Hey that's a fair point. Funny how often good ideas are kneecapped by crap executions.

[–] NobodyElse@sh.itjust.works 35 points 13 hours ago (1 children)

I’m pretty sure the phrase “kneecapped by crap executions” is in the USB working group’s charter. It’s like one of their core guiding principles.

[–] db2@lemmy.world 19 points 13 hours ago (2 children)

For anyone who disagrees: the original USB spec called for a reversible connector, and the only reason we didn't get one the whole time was that they wanted to increase profit margins.

[–] Excrubulent@slrpnk.net 17 points 13 hours ago (1 children)

USB has always been reversible. In fact you have to reverse it at least 3 times before it'll FUCKING PLUG IN.

[–] disguy_ovahea@lemmy.world 4 points 12 hours ago* (last edited 7 hours ago) (1 children)

That’s the reason Apple released the Lightning connector. They pushed for several features for USB around 2010, including a reversible connector, but the USB-IF refused. Apple wanted USB-C, but couldn’t wait for the USB-IF to come to an agreement before replacing the dated 30-pin connector.

[–] wolfpack86@lemmy.world 1 points 6 hours ago

I'm sure they were mortified that they needed to release a proprietary connector.

[–] rumba@lemmy.zip 10 points 13 hours ago (5 children)

Burn all the USB-C cables with fire except PD ones. A top-spec PD cable does everything a lower-spec cable does.

[–] Janovich@lemmy.world 10 points 12 hours ago

IDK, I’ve had PD cables that looked good for a while, but it turned out their data rate was basically USB 2.0. It seems that no matter what rule of thumb I try, there are always weird caveats.

No, I’m not bitter, why would you ask that?

[–] ripcord@lemmy.world 9 points 12 hours ago (1 children)

There are many PD cables that are bad at carrying data.

[–] disguy_ovahea@lemmy.world 7 points 12 hours ago* (last edited 9 hours ago) (1 children)

Correct. The other commenter is giving bad advice.

Both power delivery and bandwidth are backwards compatible, but they are independent specifications on USB-C cables. You can even get PD-capable USB-C cables that don’t transmit data at all.

Also, that’s not true for Thunderbolt cables. Each of the five versions has specific minimum and maximum data and power delivery specifications.
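
A minimal sketch of that independence (made-up class and field names, nothing from the actual USB-IF spec): a cable's power rating and its data rate are separate attributes, so one can be maxed out while the other sits at the floor.

```python
# Toy model only: power delivery and data bandwidth are independent
# properties of a USB-C cable, so a great charging cable can still be
# a poor data cable.
from dataclasses import dataclass

@dataclass
class UsbCCable:             # hypothetical name, for illustration only
    max_power_w: int         # highest PD wattage the cable is rated for
    data_standard: str       # e.g. "USB 2.0", "USB4", "Thunderbolt 4"
    max_data_gbps: float     # nominal signaling rate

charge_only = UsbCCable(max_power_w=100, data_standard="USB 2.0", max_data_gbps=0.48)
tb4_cable = UsbCCable(max_power_w=100, data_standard="Thunderbolt 4", max_data_gbps=40.0)

# Both charge a laptop equally well; only one is any use for an external SSD.
for cable in (charge_only, tb4_cable):
    print(f"{cable.data_standard}: {cable.max_power_w} W, {cable.max_data_gbps} Gbps")
```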

[–] GamingChairModel@lemmy.world 1 points 4 hours ago (1 children)

You can even get PD-capable USB-C cables that don’t transmit data at all.

I don't think this is right. The PD standard requires negotiating which side is the source and which is the sink, and the voltage/amperage, over those data links. So the cable has to support at least a bare minimum of data transmission for PD to work.
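
For intuition, here's a rough sketch of what that negotiation settles, with invented numbers and function names (the real exchange is a message protocol defined in the PD spec, not Python): the source advertises the voltage/current combinations it can supply, and the sink requests one.

```python
# Conceptual sketch of a PD-style contract negotiation (invented names and
# numbers, not the real message format): the source offers, the sink chooses.

SOURCE_OFFERS = [        # (volts, max amps) the charger says it can supply
    (5.0, 3.0),
    (9.0, 3.0),
    (15.0, 3.0),
    (20.0, 5.0),
]

def pick_contract(offers, watts_needed):
    """Sink side: take the lowest offer that still covers the power we want."""
    for volts, amps in offers:
        if volts * amps >= watts_needed:
            return volts, amps
    return offers[-1]    # otherwise settle for the biggest offer available

volts, amps = pick_contract(SOURCE_OFFERS, watts_needed=60)
print(f"Sink requests {volts:g} V at up to {amps:g} A ({volts * amps:.0f} W)")
```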

[–] disguy_ovahea@lemmy.world 2 points 4 hours ago* (last edited 3 hours ago) (1 children)

Technically, yes, some data must be transmitted to negotiate, but it doesn’t require high throughput. So you’ll get USB 2.0 transfer speeds (480 Mb/s) with most “charging only” USB-C cables. That’s only really useful for a keyboard or mouse these days.
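
For a back-of-the-envelope sense of what being stuck at 480 Mb/s means in practice (ignoring protocol overhead, with an arbitrary 10 GB example file):

```python
# Rough transfer-time comparison, ignoring protocol overhead.
FILE_GB = 10                              # arbitrary example size
file_bits = FILE_GB * 8 * 10**9           # decimal gigabytes -> bits

for label, rate_bps in [("USB 2.0 (480 Mb/s)", 480e6),
                        ("USB 3.2 Gen 2 (10 Gb/s)", 10e9)]:
    minutes = file_bits / rate_bps / 60
    print(f"{label}: ~{minutes:.1f} min for {FILE_GB} GB")
```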

[–] GamingChairModel@lemmy.world 1 points 2 hours ago

This limitation comes up sometimes when people try to build out a zero-trust cable where they can get a charge but not necessarily transfer data to or from an untrusted device on the other side.

[–] viking@infosec.pub 49 points 15 hours ago (16 children)

I agree with USB-C, but there are still a million USB-A devices I need to use, and I can't be bothered to buy adapters for all of them. And a USB hub is annoying.

Plus, having only 1-2 USB-C ports is never gonna be enough. If they're serious about it, why not have 5?

[–] HeartyOfGlass@lemm.ee 22 points 14 hours ago

Yeah, I'd love at least one USB-A port, 'cause most of the peripherals I own use that.

[–] jerkface@lemmy.ca 10 points 14 hours ago (3 children)

I hated when mice became the primary interface to computers, and I still do.

[–] Infomatics90@lemmy.ca 8 points 13 hours ago (1 children)

tell me you use i3 without telling me you use i3

[–] EatATaco@lemm.ee 7 points 13 hours ago (7 children)
[–] jerkface@lemmy.ca 16 points 13 hours ago* (last edited 13 hours ago) (36 children)

Even for like 20 years after mousing became the primary interface, you could still navigate much faster using keyboard shortcuts / accelerator keys. Application designers no longer consider that feature. Now you are obliged to constantly take your fingers off the home position, find the mouse, move it 3 cm, aim it carefully, click, and move your hand back to the home position: an operation taking a couple of seconds or more, when the equivalent keyboard commands could have been issued in a couple hundred milliseconds.

[–] Wav_function@lemmy.world 20 points 13 hours ago (2 children)

I love how deeply nerdy Lemmy is. I'm a bit of a nerd, but I'm not a "mice were a mistake" nerd.

[–] sugar_in_your_tea@sh.itjust.works 7 points 12 hours ago (14 children)

I don't think mice were a mistake, but they're worse for most of the tasks I do. I'm a software engineer and I suck at art, so I just need to write, compile, and test code.

There are some things a mouse is way better for:

  • drawing (well, a drawing tablet is better)
  • 3d modeling
  • editing photos
  • first person shooters (KB works fine for OG Doom though)
  • bulk file operations (a decent KB interface could work though)

But for almost everything else, I prefer a keyboard.

And while we're on a tangent, I hate WASD. Why shift my fingers over from the normal home-row position? It should be ESDF, which feels way more natural...

[–] jerkface@lemmy.ca 6 points 13 hours ago* (last edited 12 hours ago)

It's also an age thing. My visual processing is getting worse and worse. Facing a busy screen with literally thousands of objects that can be interacted with by mouse is disorienting and a cognitive drain compared to a textual interface, where I do most of the work abstractly without having to use visual processing at all. Like reading a book vs. watching a movie.

I probably have a lot more experience using pre-mouse era computers than most people. It's like being asked to start using a different language when you are 20. Yeah, you'll become perfectly fluent for a couple decades... but you'll also lose that language first when you get old.

I have noticed that millennials navigate multilayer mouse interfaces (like going down a few chained drop-down menus) way faster than I ever did. And zoomers use touchscreen keyboards almost as well as I ever touch-typed. Brains are only plastic to a degree, and it just plain feels good to use all those neurons that you first laid down when you were young and your mind was infinite.

[–] Agent641@lemmy.world 11 points 13 hours ago (4 children)

I just use a mouse to type in stuff using the on-screen keyboard. It's annoying having to take the ball out and clean it, but you get used to it.

[–] rottingleaf@lemmy.world 3 points 11 hours ago (1 children)

That functionality (first necessary, then required by guidelines, then expected, and then still usual) disciplined UI designers into making things doable in a clear sequence of actions.

Now they think any ape can make a UI if it knows the new shiny buzzwords like "material design" or "air" or whatever. And they do! Except humans can't use those UIs.

BTW, about their "air": one can look at ancient UI paradigms, specifically SunView, OpenLook, and Motif (I'm currently excited about Sun history again), Windows 3.*, and also Win9x (with WinXP being more or less inside the same paradigm). Of these, only Motif had anything resembling their "air". And Motif is generally considered clunky and less usable than the rest of those mentioned (I personally consider OpenLook the best), but compared to modern UIs, even Motif does that "air" part in a way that seems to make some sense, and it feels less clunky, which makes me wonder how that's even possible.

FFS, modern UI designers don't even think it's necessary to clearly and consistently separate buttons and links from text.

And also: freedom in Web and UI design has proven to be a mistake. UIs should be native. Web browsers should display pages adaptively (such-and-such blocks of text, such-and-such links); their appearance should be decided on the client and be native too, except for pictures. Gemini is the right way to go for the Web.
