"GitHub" Is Starting to Feel Like Legacy Software - The Future Is Now
(www.mistys-internet.website)
Fediverse version of github when? Unless it already exists?
It's called git. It's been distributed from day 1. GitHub was an attempt to centralize it.
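For the record, "distributed" here is concrete: every clone is a full repository, and clones can exchange commits with each other directly. A minimal sketch using local paths (any of which could just as well be an ssh:// or https:// URL):

```shell
# Demo in a scratch directory: every clone is a complete repository
cd "$(mktemp -d)"
git init --bare hub.git
git clone hub.git alice
git clone hub.git bob

# Alice makes a commit...
git -C alice -c user.name=Alice -c user.email=alice@example.com \
    commit --allow-empty -m "first commit"

# ...and Bob fetches it straight from Alice's clone, no central
# server in the loop (the path could just as well be a remote URL)
git -C bob remote add alice ../alice
git -C bob fetch alice
```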
Yeah... does git have issue tracking? actions? C'mon: it's not like github & co. are just git.
It doesn't have discussions, it doesn't offer pull request management with commented/annotated code reviews, it doesn't have built-in ssh and key management features, no workflows, no authorization tools of any kind...
In short I find the "just use git itself lmao" response to be an exceedingly weird thing to say, and I find it even weirder that it gets said as often as it does and upvoted so much. Git by itself is not very useful at all if there's more than one and a half people working on the same code.
A server hosting a copy of the repo, git send-email, a mailing list and a bugzilla instance is all that an open source project really needs.
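The patch-by-mail part of that stack is built into git itself, via git format-patch and git send-email. A minimal sketch in a scratch repo; the list address is a placeholder:

```shell
# Scratch repo with one commit to turn into a patch
cd "$(mktemp -d)" && git init demo && cd demo
echo 'hello' > README
git add README
git -c user.name=Dev -c user.email=dev@example.com commit -m "add README"

# Export the most recent commit as a mailbox-format patch file
git format-patch -1 -o outgoing/

# Mailing it is one more command (list address is a placeholder, and
# git send-email needs SMTP settings configured under sendemail.*):
#   git send-email --to=project-devel@lists.example.org outgoing/*.patch
```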
The advantage of github/gitlab et al. is that they merge all of the above functionality into one place, but it's not absolutely essential. Git itself is extremely versatile and can be as useful as you want it to be if you put in the time to learn it.
I love how much spare time you have to learn and maintain your infrastructure unnecessarily instead of working on the code. It's like being a bus driver by day and a mechanic by night.
Depends how interested you are in the infrastructure I suppose. Obviously it's not essential for any project. I see a few that have both self hosted resources and additionally a Github mirror.
An advantage of the "old school" approach is that you don't end up tied into a large SaaS platform like Github.
Again, like OP said, those are typically distinct pieces of functionality: issue tracking, source control, deployment etc. GitHub bringing everything into one platform is atypical and obviously done for the goal of centralization. The more stuff you add to a platform, the harder it becomes to leave or replicate.
But no, technically speaking you don't need to have all of it in one place. There's no reason you must manage everything together.
I don't even understand why people like GitHub so much, its source management sucks. The fact it still doesn't have a decent history visualization to this day is mind-boggling.
Look for ways to do things separately and you will find much better tools. GitHub's "one size fits all" approach is terrible and only holds because people are too lazy to look for any alternative.
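To be fair, git does ship a history visualization of its own: git log --graph in the terminal, and the gitk GUI. A quick sketch in a throwaway repo:

```shell
# Scratch repo with a couple of commits to visualize
cd "$(mktemp -d)" && git init demo && cd demo
git -c user.name=Dev -c user.email=dev@example.com commit --allow-empty -m "initial commit"
git -c user.name=Dev -c user.email=dev@example.com commit --allow-empty -m "second commit"

# ASCII history graph, one line per commit, all branches
git log --graph --oneline --all --decorate

# The GUI equivalent ships with git (needs a display):
#   gitk --all
```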
I agree with this part.
Perhaps this is part of the answer to why people like github. Unlike you, most people love all-in-one tools. I once suggested a bunch of offline tools to use with git, with much better user experience than github. The other person was like, "Yeah, no! I don't want to learn that many tools".
The advantage of a centralized app is that all the services you mentioned are well integrated with each other. The distinct, often offline tools tend to integrate poorly with one another; that's harder to achieve than in a centralized host. The minimum you'd need to start with is a set of standards for all these tools to follow, so that interoperability is possible later.
It's not that complicated... people use it because everyone has an account there and so your project gets more visibility (and your profile too, for those who plan to flex it when they look for the next job) and more contributions. Even a lot of projects that aren't on github have some sort of mirror there for visibility.
Suppose you wanna contribute to gnu grep (or whatever)... do you happen to know off the top of your head where the source repo and bug tracker are? And do you know what the procedure is to submit your patch?
If you are a company doing closed source, I agree that I don't see why you would choose github over the myriad alternatives (including the self hosted ones).
That's a great way to spend your resources developing yet-another-source-forge-thingie instead of whatever your actual project/product is supposed to be :)
But you don't have to develop anything. There are plenty of ready-made excellent tools you can just drop in. The main fallacy is that what Github does is actually useful, or that the pieces it integrates are useful. 90% of Github is subpar for any given purpose. Consider all the possible types of software being developed and all the different release flows and support/issue flows: how could they possibly be shoehorned into a one-size-fits-all? Yet people try their damnedest to do exactly that.
To do software development you need (A) issue tracking, (B) a clear release flow, and (C) a deploy mechanism that's easy to use. A is a drop-in tool with lots of alternatives, B is unrestricted since Git is very flexible in this regard, and C is typically included with any cloud infrastructure, unless you're doing on premise in which case there are also drop-in tools.
A, B, C are three distinct, orthogonal topics that can and should be handled separately. There's no logical reason to shape any of them after the other. They have to work together, sure, but the design considerations of one must not affect the others.
I interpreted your "look for ways to do things separately" as "look for separate tools that do the various things" (and you have to integrate), but I see now that you meant "look for ways to do things differently". My bad.
I used gerrit and zuul a while back at a place that really didn't want to use GitHub. It worked pretty well but it took a lot of care and maintenance to keep it all ticking along for a bunch of us.
It has a few features I loved that GitHub took years to catch up to. Not sure there's a moral to this story.
What combination would you recommend to replace most common GitHub functionality?
It depends a lot on the setup you have, how many people, release flow etc. Issue tracking depends on the kind of software you do and whether you want a programmer-only flow or a full support flow.
Deploy pipelines will usually depend on the infrastructure; cloud solutions can usually integrate with several, and there are also common solutions, including FOSS ones (e.g. OpenTofu, the fork of Terraform).
Git frontends are a very mixed bag. Generally speaking their main purpose is to hide Git as much as possible and allow programmers to contribute changes upstream without knowing much beyond the nebulous "PR" concept. Basically they're mostly useless other than enabling people to remain dumb. A good Git tutorial and a good history visualization tool (git happens to include one called gitk out of the box) will do so much more to teach people Git, and there's really no substitute for communication – using annotations to discuss pros and cons for a PR is badly inadequate.

Forgejo should work
Forgejo is what you're wanting
That seems to be it. I didn't know that existed.
I'm glad I get to introduce you to it! The biggest instance is Codeberg. Fediverse integration isn't there yet, but the general consensus is it's coming very soon, since that's Codeberg's main focus for the forgejo project right now.
https://forgejo.org/faq/#is-there-a-roadmap-for-forgejo
Git is already decentralized
They're asking for a federated forge, not decentralized VCS.
I should be able to log into my own instance and use that account to open a bug report with your project, for example.
Forgejo is working on that, but it's not there yet.
Github is more than just git. We need decentralized solutions for associated services and persistently online repos.
Gitlab and forgejo
Something like radicle?
https://radicle.xyz/
Piping curl into sh in install instructions is a fast track to me not taking a project seriously

Yeah, like Lemmy
Excited for Sublinks...
I've heard this over and over... what's the difference security-wise between sudo running some install script and sudo installing a .deb (or whatever package format) ?
@gomp try comparing it with apt install, not with downloading a .deb file from a random website - that is obviously also very insecure. But the main thing curl|sh will never have is verifying the signature of the downloaded file - what if the server got compromised and someone simply replaced it? You want to make sure that it comes from the actual author (you still need to trust the author, but that's a given, since you are running their code). Even a signed tarball is better than curl|sh.

Installing a .deb is what I was thinking about.
If you have a pre-shared trusted signature to check against (like with your distro's repos), yes. But... that's obviously not the case since we are talking installing software from the developer's website.
Whatever cryptographic signature you can get from the same potentially compromised website you get the software from would be worth as much as the usual md5/sha checksums (i.e. it would only guard against transmission errors).
@gomp Why would you be taking the signature from the same website? Ever heard of PGP key servers?
That would be "a pre-shared trusted signature to check against", and is seldom available (in the real world where people live - yes, there are imaginary/ideal worlds where PGP is widespread and widely used) :)
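For the curious, the verification being argued about looks like this end to end. This sketch plays both sides with a throwaway key in an isolated keyring; all filenames are placeholders:

```shell
# Isolated keyring so this doesn't touch your real one
export GNUPGHOME="$(mktemp -d)"; chmod 700 "$GNUPGHOME"; cd "$GNUPGHOME"

# -- maintainer side: generate a signing key and sign the release --
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'Maintainer <dev@example.com>'
echo 'release contents' > project-1.0.tar.gz
gpg --batch --pinentry-mode loopback --passphrase '' \
    --detach-sign --armor project-1.0.tar.gz

# -- user side: verify the download against the detached signature --
# (the public key must come from somewhere you already trust: a keyserver,
# an earlier release, your distro... NOT the same site as the tarball)
gpg --verify project-1.0.tar.gz.asc project-1.0.tar.gz
```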
@gomp You mean, as seldom available as every apt install ever? https://superuser.com/a/990153

My bad for causing confusion: when I wrote "trusted signature" I should have said "trusted public key".
The signatures in an apt repo need to be verified with some public key (you can think of signatures as hashes encrypted with some private key).
For the software you install from your distro's "official" repo, that key came with the .iso you installed your system from (it may have been updated since, but that's beside the point here).
When you install from third-party repos, you have to manually trust the key (IIRC in Ubuntu it's something like curl <some-url> | sudo apt-key add -?). So, this key must be pre-shared (you usually get it from the dev's website) and trusted.

@gomp Yes but the point is that it comes from a different place and a different time, so for you to execute a compromised program, it would have to be compromised for a prolonged time without anyone else noticing. You are protected by the crowd. In curl|sh you are not protected from this at all
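As a side note, apt-key is deprecated in current Debian/Ubuntu; the replacement pattern stores the key in its own keyring file and points the source at it with signed-by. A config sketch with placeholder URLs (run as root):

```shell
# Fetch the developer's public key once, store it outside apt's global trust
curl -fsSL https://example.com/repo/signing-key.asc \
  | gpg --dearmor -o /usr/share/keyrings/example-archive-keyring.gpg

# Reference that key explicitly, for this one repository only
echo "deb [signed-by=/usr/share/keyrings/example-archive-keyring.gpg] \
https://example.com/repo stable main" > /etc/apt/sources.list.d/example.list
```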
A deb is just a zip file that gets unpacked to where your binaries go. A shell script you curl pipe into shell could contain literally any instructions
Binary packages have scripts (IIRC for .deb they are preinst/postinst to be run before/after installation and prerm/postrm before/after removal) that are run as root.
BTW the "unzip" part is also run as root, and a binary package can typically place stuff anywhere in your system (that's their job after all)... even if you used literal zip files they could still install a script in ways that would cause the OS to execute it.
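Those maintainer scripts are easy to see for yourself with dpkg-deb. A sketch that builds a throwaway package just to show where postinst rides along (package contents are made up):

```shell
# Build a minimal .deb that carries a postinst script
cd "$(mktemp -d)"
mkdir -p pkg/DEBIAN
printf 'Package: demo\nVersion: 1.0\nArchitecture: all\nMaintainer: Demo <demo@example.com>\nDescription: maintainer-script demo\n' > pkg/DEBIAN/control
printf '#!/bin/sh\necho "postinst runs as root at install time"\n' > pkg/DEBIAN/postinst
chmod 755 pkg/DEBIAN/postinst
dpkg-deb --build pkg demo.deb

# Extract the control archive: this is where preinst/postinst/prerm/postrm live
dpkg-deb -e demo.deb ctrl
cat ctrl/postinst
```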
Yeah I'm oversimplifying on purpose here. The bottom line is piping into sh is dangerous

Just install it manually via cargo then.
I once heard of torrent git
I've read that GitLab is experimenting with the concept.