Having some supported, ready-to-use hardware mentioned on the site could be helpful if someone wants to try it (say, a Raspberry Pi). There are probably people who are worried it will make their computer explode.
quickstart -> supported machines is right there
Oh my god they rewrote Linux in Rust. Amazing.
No, even better!
Now imagine the new COSMIC desktop environment in Rust on Redox, that would be great
This is already in progress. COSMIC applications are compatible with Redox OS.
This. Is. Brilliant!
You guys are pioneers.
This is VERY important for the future of Linux.
If you dive into it, Linux security is a total mess. You have SELinux, userspace and all that, permission systems and mandatory access control.
And then you have the kernel, which is (to roughly quote Daniel Micay from some 5-year-old Reddit comment) "like you imagine systemd, but way worse and completely established". It is a huge set of software written in unsafe C, with complete access over the entire system, no matter if it's just some ancient driver, some weird unused filesystem support or whatnot.
The kernel is huge bloat, and even if you don't want to accept it, a big reason is distros not getting their shit together and working on the same thing. Since drivers can't be implemented in userspace (every distro does that differently and things break), for the sake of unifying everything it all gets baked into the kernel.
"Kernel hardening", as far as I understand it, is mostly just restricting those unneeded features, making it log less critical info, blocking some external manipulation...
But the essence really is that the Linux kernel isn't something everyone should use. There should be modules for the hardware components, external drivers that are installed alongside.
I guess Gentoo is right here, but it's very inconvenient to use. Still, having your own custom kernel, containing only the modules you need, would be a start. In the end, though, separate drivers are necessary.
If it weren't "written in rust" nobody would give a shit.
So?
I wouldn't say it's inappropriate, as there is more and more Rust making it into the native kernel. I'll definitely throw this on my Ventoy USB and see if I can get it to boot.
It is not Linux, but there is no other good community for it, I guess.
Redox even works on some hardware! It's made pretty much from scratch, and being a microkernel means you actually need separate drivers, afaik.
I don't understand the obsession with rust.
From my personal experience I can tell you two reasons. The first is that this is the first general-purpose language that can be used for all kinds of projects. You can use it in the web browser with WebAssembly, it is good for backends, and it is also low-level enough to use for OS development and embedded. Other languages are good only for some things and really bad for others. The second reason is that it is designed around catching errors at compile time. The error handling and strict typing force the developer to handle errors. I have to spend more time writing the program but considerably less time finding and fixing bugs.
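To make the second point concrete, here is a minimal sketch (the file name and helper function are made up for illustration): the Result return type means the caller has to deal with every failure path before the program even compiles.

```rust
use std::fs;
use std::num::ParseIntError;

// Hypothetical helper: read a port number from a config file.
// Both failure modes are visible in the signature; the caller
// cannot silently ignore them.
fn read_port(path: &str) -> Result<u16, String> {
    let text = fs::read_to_string(path)
        .map_err(|e| format!("could not read {path}: {e}"))?;
    text.trim()
        .parse::<u16>()
        .map_err(|e: ParseIntError| format!("not a valid port: {e}"))
}

fn main() {
    // The match forces the error branch to be handled explicitly.
    match read_port("port.conf") {
        Ok(port) => println!("listening on {port}"),
        Err(msg) => eprintln!("config error: {msg}"),
    }
}
```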
As much as I want to love Rust, that's not entirely true.
Writing a web API in Rust is a pain. It requires way too much boilerplate for very low-level concerns, for example having to deal with all the lifetime crap in a simple CRUD endpoint. I understand why that's necessary, but compared to Python or Java it's just a very large mental-load overhead.
You need fewer and fewer explicit lifetimes as time goes on. The compiler gets better at inferring them, and you can always use the heap if you want to or if what you are doing isn't very low level.
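Roughly what that trade-off looks like in practice (struct and function names are made up): the borrowed version needs an explicit lifetime tying the view to the request data, while the owned, heap-allocated version that most web handlers end up using needs none, at the cost of a copy.

```rust
// Borrowed: the view cannot outlive the buffer it points into,
// and the lifetime annotation spells that out.
struct UserView<'a> {
    name: &'a str,
}

fn view_user<'a>(raw: &'a str) -> UserView<'a> {
    UserView { name: raw.trim() }
}

// Owned: a heap-allocated copy, no lifetime annotations needed.
struct UserDto {
    name: String,
}

fn fetch_user(raw: &str) -> UserDto {
    UserDto { name: raw.trim().to_owned() }
}

fn main() {
    let raw = String::from("  alice  ");
    println!("{}", view_user(&raw).name);
    println!("{}", fetch_user(&raw).name);
}
```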
I know the evangelists can be somewhat overwhelming, but its popularity is not unwarranted. It's fairly easy to pick up and has an incredibly enthusiastic and welcoming community. People like it because it's incredibly performant and it's memory-safe. In terms of DX it's really a joy to work with. It just has a LOT going for it, and the main drawback you'll hear about (difficulty) is really overblown; most devs can pick it up in a matter of months.
The main difficulty I have with Rust (what prevents me from using it) is that the maintainers insist on statically compiling everything. This is fine for small programs, and even for large monolithic applications that are not expected to change very often.
But for the machine learning projects I work on, I might want to include a single algorithm from a fairly large library of algorithms. The amount of memory used is not trivial: I am talking about the difference between loading a single algorithm as 50 MB of compiled code from a dynamically loadable library, versus loading the entire 1.5 GB library of statically linked algorithms just to use that one. When distributing this code to a few dozen compute nodes, 50 MB versus 1.5 GB is suddenly a very noticeable difference.
There are other problems with statically linking everything as well. For example, if you want your application to be written in a high-level language like Python, TypeScript, or Lisp, you might want a library of Rust code that you can dynamically load into the Python interpreter and establish foreign function bindings to the Rust APIs. But this is not possible with statically linked code.
And as I understand it, this is a difficult technical problem to solve. Apparently, in order for Rust to optimize a program and guarantee type safety and performance, it needs the type information from the source code. That type information is not normally stored in dynamically loadable libraries (the .so or .dll files), so if you dynamically load a library into a Rust program, its type safety and performance guarantees go out the window. So the Rust compiler developers have chosen to make everything as statically compiled as possible.
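For the narrower case of calling Rust from another language, the usual workaround is to freeze the boundary at a plain C ABI, which is exactly where those guarantees stop. A minimal sketch (crate setup and function name are invented for the example):

```rust
// Cargo.toml (excerpt):
// [lib]
// crate-type = ["cdylib"]   # emit a .so / .dll instead of a Rust rlib

// A made-up exported function with a C ABI. Generics, traits and the
// rest of Rust's type information cannot cross this boundary; only
// C-compatible types can.
#[no_mangle]
pub extern "C" fn score(x: f64, y: f64) -> f64 {
    (x * x + y * y).sqrt()
}
```

On the Python side such a library can be opened with ctypes and the exported function called like any C function, but everything richer than C types stays behind that wall.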
This is why I don't see Rust replacing C any time soon. A language like Zig might have a better chance than Rust because it can produce dynamically loadable libraries that are fully ABI compatible with the libraries compiled by C compilers.
Just asking, as I don't have that much knowledge about static and dynamic linking: my understanding was that when you link statically, the compiler integrates the implementations of the directly or indirectly used functions and other symbols into the resulting binary but ignores everything else. Wouldn't that mean the result is either smaller overall, or at least as small as a dynamic library plus executable? After all, the dynamic library has to contain every implementation, since it doesn't know about the executables on the system.
So the only case where static linking results in overall bigger sizes than dynamic linking would be when many (at least two) executables use the same library. And you even said that you only use one algorithm from a big library, so static should be way smaller than dynamic even with many executables.
If you meant memory usage: I thought dynamic libraries would have to be loaded completely, because you can't know which function will be needed next and just-in-time loading would be way too slow, while static binaries will only load what will be needed, as I understand it.
It’s a system programming language that isn’t C or C++.
Edit to add: How did Go get on that page? That’s a stretch.