This isn't Linux, but it's Linux-like. It's a microkernel OS written in the Rust programming language. It's still experimental, but I think it has great potential. It has a GUI desktop, but the compiler isn't quite fully working yet.

Has anyone used this before? What was your experience with it?

Note: If this is inappropriate since it isn't technically Linux, mods, please take it down.

[–] cashews_best_nut@lemmy.world 0 points 10 months ago (14 children)

I don't understand the obsession with Rust.

[–] MonkCanatella@sh.itjust.works 0 points 10 months ago (2 children)

I know the evangelists can be somewhat overwhelming, but its popularity is not unwarranted. It's fairly easy to pick up and has an incredibly enthusiastic and welcoming community. People like it because it's incredibly performant and memory-safe. In terms of DX it's really a joy to work with. It just has a LOT going for it, and the main drawback you'll hear about (difficulty) is really overblown; most devs can pick it up in a matter of months.

[–] Ramin_HAL9001@lemmy.ml 0 points 10 months ago (1 children)

The main difficulty I have with Rust (and what prevents me from using it) is that the maintainers insist on statically linking everything. This is fine for small programs, and even for large monolithic applications that are not expected to change very often.

But for the machine learning projects I work on, I might want to include a single algorithm from a fairly large library of algorithms. The amount of memory used is not trivial: I am talking about the difference between loading a single algorithm out of a 50 MB dynamically loadable library, versus loading the entire 1.5 GB library of algorithms as statically linked code just to use that one algorithm. When distributing this code to a few dozen compute nodes, that 50 MB versus 1.5 GB is suddenly a very noticeable difference.
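
To make that concrete, here is a rough sketch of what pulling just one routine out of a shared library at run time looks like from a native caller, using the `libloading` crate; the library path, symbol name, and signature are made up for illustration:

```rust
// Hypothetical example: load one algorithm out of an already-built .so
// instead of statically linking the whole collection.
// (Requires the `libloading` crate as a dependency.)
use libloading::{Library, Symbol};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    unsafe {
        // Map the shared library at run time (path is made up).
        let lib = Library::new("libalgos.so")?;
        // Look up a single C-ABI entry point by name.
        let kmeans_fit: Symbol<unsafe extern "C" fn(*const f64, usize, u32) -> i32> =
            lib.get(b"kmeans_fit")?;
        let data = [1.0_f64, 2.0, 3.0, 4.0];
        let status = kmeans_fit(data.as_ptr(), data.len(), 2);
        println!("kmeans_fit returned {status}");
    }
    Ok(())
}
```

In C or Python the equivalent is a `dlopen` or `ctypes` call; the point is that only the shared object has to be shipped and mapped, not the whole statically linked library.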

There are other problems with statically linking everything as well. For example, if you want your application to be written in a high-level language like Python, TypeScript, or Lisp, you might want a library of Rust code that you can dynamically load into the interpreter and establish foreign function bindings to the Rust APIs. But this is not possible with statically linked code.
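
For illustration, here is a rough sketch of what that binding surface looks like on the Rust side (crate and function names are made up): Cargo can be told to emit an ordinary shared library with `crate-type = ["cdylib"]`, and anything callable from outside has to be exported through the C ABI:

```rust
// lib.rs of a hypothetical crate built with `crate-type = ["cdylib"]` in
// Cargo.toml, so the output is a plain .so/.dll.

/// C-ABI entry point that a Python interpreter could load via ctypes/cffi.
/// Only C-compatible types (raw pointers, lengths, plain numbers) can cross
/// this boundary; Rust-level types, traits, and generics cannot.
#[no_mangle]
pub extern "C" fn mean(data: *const f64, len: usize) -> f64 {
    if data.is_null() || len == 0 {
        return f64::NAN;
    }
    // SAFETY: the caller promises `data` points to `len` valid f64 values.
    let xs = unsafe { std::slice::from_raw_parts(data, len) };
    xs.iter().sum::<f64>() / len as f64
}
```

On the Python side this would be loaded with `ctypes.CDLL(...)` and the argument/return types declared by hand. So the cdylib route does exist; the catch, as described next, is that everything crossing that boundary is flattened down to a C ABI.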

And as I understand it, this is a difficult technical problem to solve. Apparently, in order for Rust to optimize a program and guarantee type safety and performance, it needs the type information from the source code. This type information is not normally stored in dynamically loadable libraries (the .so or .dll files), so if you dynamically load a library into a Rust program, its type-safety and performance guarantees go out the window. So the Rust compiler developers have chosen to make everything as statically compiled as possible.
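
A small illustration of that point (the example is mine): a generic Rust function has no single compiled form at all, so the compiler only generates machine code for it when a caller supplies concrete types, which requires the full type information at compile time:

```rust
use std::ops::Add;

// A generic function has no single machine-code form. The compiler generates
// code for it only when a call site supplies concrete types
// ("monomorphization"), which is why the type information must be available
// at compile time -- exactly what a pre-built .so/.dll does not carry.
pub fn sum<T: Add<Output = T> + Copy + Default>(xs: &[T]) -> T {
    xs.iter().copied().fold(T::default(), |acc, x| acc + x)
}

fn main() {
    // Each concrete use stamps out its own copy inside *this* binary:
    let a = sum(&[1.0_f64, 2.0, 3.0]); // instantiates sum::<f64>
    let b = sum(&[1_u32, 2, 3]);       // instantiates sum::<u32>
    println!("{a} {b}");
}
```

Monomorphized instances could in principle be shipped in a shared library, but only the ones chosen in advance, which is the trade-off being described here.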

This is why I don't see Rust replacing C any time soon. A language like Zig might have a better chance than Rust, because it can produce dynamically loadable libraries that are fully ABI-compatible with libraries produced by C compilers.

[–] naptera@feddit.de 0 points 10 months ago

Just asking, as I don't have that much knowledge about static and dynamic linking: my understanding was that when you link statically, the linker only copies the implementations of the directly or indirectly used functions and other symbols into the resulting binary and ignores everything else. Wouldn't that mean the result is either smaller overall, or at least as small as a dynamic library plus executable? The dynamic library, after all, has to contain every implementation, since it can't know which executables on the system will use it.

So the only case where static linking results in a bigger overall size than dynamic linking would be when many (at least two) executables use the same library. And you said you only use one algorithm from a big library, so the static build should be much smaller than the dynamic one, even with many executables.

If you meant memory usage at run time: I would have thought a dynamic library has to be loaded completely, because you can't know which function will be needed next and loading just in time would be far too slow, whereas a statically linked binary only contains what will actually be needed, as I understand it.
