this post was submitted on 28 Nov 2024
54 points (90.9% liked)

Privacy

I've been playing around with ollama. Given that you download the model yourself, can you trust that it isn't sending telemetry?

[–] marcie@lemmy.ml 24 points 1 day ago (1 children)

you can check the process to see if it's communicating at all. none of the big ones do. it's possible someone could be messing with the file, though; before the safetensors format this was a big issue, and it still is to some extent. only download from reputable sources
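Checking the process from a terminal might look like this. A minimal sketch, assuming a Linux host with `pgrep` and `ss` (iproute2) available; the server process name `ollama` is an assumption, adjust it for your setup:

```shell
# Print any TCP sockets held by a process, looked up by name.
# A purely local setup should show only loopback traffic (ollama's API
# defaults to 127.0.0.1:11434); connections to outside addresses are
# worth investigating.
check_sockets() {
  pid=$(pgrep -x "$1" | head -n 1)
  [ -n "$pid" ] || { echo "no process named $1"; return 0; }
  ss -tnp | grep "pid=$pid,"
}

check_sockets ollama
```

Running it periodically (or under `watch`) catches connections that only open during inference.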

[–] JustJack23@slrpnk.net 2 points 1 day ago (2 children)

Can't you run it from a container? I guess that will slow it down, but it will deny access to your files.

[–] acockworkorange@mander.xyz 7 points 1 day ago (1 children)

Containers don’t really slow down apps significantly. It’s not a VM; it’s still a native app running on your kernel, just in a separate memory space with restricted access to hardware.

[–] JustJack23@slrpnk.net 0 points 1 day ago (1 children)

That is true for Linux and maybe macOS, but on Windows I think they have a bit more overhead. Again, though, I agree that in most cases it is not significant.

[–] acockworkorange@mander.xyz 4 points 1 day ago* (last edited 1 day ago)

Is the overhead because of containers, or is it because you’re running something that is meant to run on Linux through a compatibility layer like MinGW?

[–] marcie@lemmy.ml 9 points 1 day ago (1 children)

yeah, you could. though i don't see any evidence that the large open source llm programs like jan.ai or ollama are doing anything wrong with their programs or files. chucking it in a sandbox would settle the question for good, though
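For a sandbox along those lines, Docker's `--network none` flag cuts the container off from the network entirely. A sketch, assuming the official `ollama/ollama` image and a model volume populated beforehand (once offline, the container can't download anything):

```shell
# Defined as a function so the command can be reviewed before running;
# requires Docker. Pull the model into the volume first, e.g. by running
# the container once *with* networking, since --network none blocks downloads.
run_ollama_offline() {
  docker run -d --name ollama-offline \
    --network none \
    -v ollama_models:/root/.ollama \
    ollama/ollama
}

# run_ollama_offline   # uncomment on a host with Docker installed
```

With `--network none` the container gets only a loopback interface, so even a tampered binary has nowhere to send telemetry.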

[–] SeekPie@lemm.ee 5 points 1 day ago* (last edited 1 day ago)

You could use the "Alpaca" flatpak and remove its internet access with Flatseal after downloading the model. (Linux)

Or deny the app access to the internet in the app settings. (Android)
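The same Flatseal toggle can be applied from the command line with `flatpak override`. A sketch, assuming Alpaca's application ID is `com.jeffser.Alpaca` (verify yours with `flatpak list`):

```shell
# Revoke network access for the Alpaca flatpak once the model download
# has finished. --user applies the override per-user, so no root is needed.
revoke_alpaca_network() {
  flatpak override --user --unshare=network com.jeffser.Alpaca
}

# To undo later: flatpak override --user --reset com.jeffser.Alpaca
# revoke_alpaca_network   # uncomment on a host with flatpak installed
```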